Why Uganda is not ready for robots as employees and employers!

The Employment Act of Uganda (2006) defines an employee as any person who has entered into a contract of service or an apprenticeship contract, including, without limitation, any person employed by or for the Government of Uganda, the Uganda Public Service, a local authority or a parastatal organisation, but excluding a member of the Uganda Peoples' Defence Forces (Section 2). The same section defines a contract of service as any contract, oral or written, express or implied, whereby a person agrees, in return for remuneration, to work for an employer; this includes a contract of apprenticeship. Further, Section 3 provides for a number of rights and obligations, ranging from leave and the payment of wages to the right against forced labour, and explicitly states that these rights and benefits apply to workers classified as employees under a contract of service.

The implication is that for one to operate under the current legal framework as an employer or employee, an employer-employee relationship must be created and regulated by a contract that can rightly be termed an employment contract. It is important to note that employment contracts, like other contracts, are governed by the fundamental principles of contract law, such as offer and acceptance and the capacity to contract. The legal framework in the Contracts Act (2010), the Employment Act (supra) and other pieces of legislation governing employment relations has been designed to cater for human labour, having regard to human traits and capabilities.

However, with the advent of the Fourth Industrial Revolution, the rise of machines in the workplace has well and truly begun. Data from the International Federation of Robotics reveals that the pace of industrial automation is accelerating across much of the developed world: in 2016 there were 74 installed industrial robots per 10,000 employees globally, and a year later that figure had risen to 85 across the manufacturing sector. Europe has a robot density of 106 units per 10,000 workers, with 91 in the Americas and 75 in Asia. China is one of the countries recording the highest growth in industrial automation, but nowhere has a robot density like South Korea. (Niall McCarthy, Data Journalist, Statista; 01 May 2019). With such projections, one ponders the feasibility and practicability of engaging robots in the workplace as employees in a third world economy like Uganda, where unemployment is already high at an estimated 1.92% as of 2020.

What is a robot?

Bertolini broadly defines a robot as "a machine which may either be provided of a physical body, allowing it to interact with the external world, or rather having an intangible nature, such as a software program, which in its functioning is alternatively directly controlled or simply supervised by a human being, or may even act autonomously in order to perform tasks, which present different degrees of complexity (repetitive or not) and may entail the adoption of not predetermined choices among possible alternatives, yet aimed at attaining a result or providing information for further judgment, as so determined by its user, creator or programmer, including but not limited to the modification of the external environment, and which in so doing may interact and cooperate with humans in various forms and degrees." (Bertolini (2013), p. 219).

As is always the case with new technological developments, there is fear of negative consequences and of the technology's success or failure. There is also a tendency to accentuate the negative aspects of new technology when its consequences are not fully understood. As Arthur C. Clarke states in his Third Law, any sufficiently advanced technology is indistinguishable from magic, and magic is incomprehensible and therefore dangerous (Profiles of the Future: An Inquiry into the Limits of the Possible, 1973). Considering that employment contracts are at the fulcrum of employment, fundamental questions arise: whether, in light of Uganda's legal framework, robots are vested with the capacity to contract; whether a separate, sui generis legal framework would be required for contracting with robots, or whether the current laws can instead be redesigned in a manner that caters for robots working alongside humans; and whether legal personality can be bestowed upon them to incur liability where the contract is breached.

Capacity to Contract.

In Printing and Numerical Registering Co v Sampson (1875), Sir George Jessel M.R. held that men of full age and competent understanding possess the capacity to contract. This is fortified by Section 11 of the Contracts Act 2010, which states that a person has the capacity to contract if the person is eighteen years or above and of sound mind, among other considerations. The opinion of Sir George Jessel points to two main aspects for close consideration:

1. Full Age

For human beings, the Contracts Act stipulates the contracting age as eighteen years and above, or sixteen years and above as provided for under Article 34(5) of the Constitution of the Republic of Uganda (1995). Arguably, the rationale for treating this as full age is that by this age, one possesses sufficient mental and intellectual maturity to accurately analyse situations and make informed decisions on a given subject matter as regards offer, acceptance and consideration, and has the capacity to bear the consequences arising out of liability. This also applies to contracts of employment. As an equivalent of growth and age for human beings, research has shown that through deep learning (a machine-based kind of learning built on a set of algorithms that attempt to model high-level abstractions in data), machines, unlike humans, are connected the whole time: if one machine makes a mistake, all autonomous systems will keep this in mind and avoid the same mistake the next time. Through self-learning, it is said that machines, robots included, are able to optimise their own behaviour on the basis of their former behaviour and experience. (Gerlind et al.; Artificial Intelligence and Robotics and Their Impact on the Workplace, 2017, p. 10).
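The claim above, that a mistake made by one machine is instantly "remembered" by all connected machines, can be illustrated with a minimal sketch. This is a hypothetical toy model, not drawn from any real robotics system; the class and method names are the author's own assumptions for illustration.

```python
# Toy illustration of fleet-wide learning: every robot references the same
# shared memory, so a mistake recorded by one is avoided by all the others.
class SharedExperience:
    """A memory pool shared by every robot in a fleet (hypothetical)."""
    def __init__(self):
        self.known_mistakes = set()

    def record_mistake(self, action):
        self.known_mistakes.add(action)


class Robot:
    def __init__(self, name, shared):
        self.name = name
        self.shared = shared  # all robots hold a reference to the same pool

    def attempt(self, action, succeeds):
        if action in self.shared.known_mistakes:
            # The fleet has already failed at this; refuse without retrying.
            return f"{self.name}: refuses {action} (fleet already failed at it)"
        if not succeeds:
            self.shared.record_mistake(action)  # every robot learns instantly
            return f"{self.name}: failed {action}, fleet updated"
        return f"{self.name}: completed {action}"


pool = SharedExperience()
a, b = Robot("robot-A", pool), Robot("robot-B", pool)
print(a.attempt("grip-fragile-item", succeeds=False))
print(b.attempt("grip-fragile-item", succeeds=True))  # B avoids it anyway
```

The point of the sketch is the contrast with human learning: robot-B never made the mistake itself, yet behaves as if it had, which is why a fixed "age of maturity" analogous to eighteen years maps poorly onto machines.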

Age eligibility to contract (currently 18) is incompatible with robots.

This basically means that machines have the ability to grow and evolve intellectually faster than human beings, and as such might not require 16 to 18 years to attain full age to contract. It can then be argued that, depending on the content of the contract of employment (especially if it contains variable, subjective or open clauses) and on the nature and complexity of the tasks carried out, legislators could study and gauge the minimum age at which a robot can contract. This, however, becomes more complex if one considers that different robots have different manufacturers, and hence a different compounding of algorithms applies to each; as such, they may intellectually mature at different speeds and intervals. It should also be taken into consideration that tasks and assignments differ from one workplace to another, and so are the mistakes likely to be made, which implies that self-learning or machine learning towards a particular required intelligence differs in the time taken to achieve it. Even if employers signed probationary contracts with robots, allowing them time to prove their maturity to full "age" as a precursor to signing the main contract, what would happen if a robot failed (as any machine might sometimes do) to reach the desired intelligence in time? Would the employer then classify that as breach of the probationary contract and therefore terminate the robot employee? As tasks change and become more complex, employees are required to adapt easily, and taking longer periods might be detrimental to the business.

It would require legislators, in an effort to find a standard age of employment, to consider the probable age of maturity (full age), counted from the time of manufacture, at which robots can contract as adults. The issue arises in two circumstances. First, only the manufacturer can tell at what speed a robot is likely to grow or evolve intellectually. To whom, then, would liability fall in case of failure to evolve? Would it lie with the manufacturer, or can robots be given legal personality, or better still, can the liability be apportioned according to that which the robot is able and justified to bear, given its ability to function at that particular time?

Second, where robots work alongside human beings in the workplace, one must be careful not to occasion any form of discrimination by giving preferential treatment to robots in respect of the required equivalent age to contract in the workplace and the liability arising therefrom. The 1998 ILO Declaration on Fundamental Principles and Rights at Work calls on all member states to promote and realise within their territories the right to be free from discriminatory employment practices. This is reinforced in Section 6 of the Employment Act (supra).

2. Competent Understanding.

As to whether robots possess sufficient intelligence to understand and execute a contract, to make an offer, to accept an offer, or to freely negotiate terms even within the course of employment, most robots run on algorithms. When algorithms become more complex and are compounded, they can be viewed as intelligence (Artificial Intelligence). However, an algorithm is developed to the end that it can achieve a particular result. Competent understanding speaks to the ability to assess a situation and judge, based on one's experience, whether it is applicable or good for oneself. When one contracts, one contracts in one's own best interest. The idea that a human being can contract with a robot becomes difficult, because if a robot is programmed to a particular end, then it has not analysed the entire prospect or situation to see which solution would be most applicable to it. An algorithm therefore amounts to a representation of someone else's mind. Whether this then amounts to coercion and duress by the manufacturer or programmer is an argument that is important to have.
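The argument that an algorithm represents someone else's mind can be made concrete with a short sketch. This is a hypothetical illustration, not a real negotiation system: a "robot negotiator" whose acceptance rule was fixed, once and in advance, by its programmer.

```python
# Illustrative only: the 'agent' never weighs its own interests at the moment
# of contracting; it mechanically applies a rule its programmer chose earlier.
def make_negotiator(minimum_wage):
    """The programmer fixes the decision rule once, in advance."""
    def decide(offered_wage):
        # Whatever offer arrives, the outcome was settled when the
        # threshold was set; the 'choice' belongs to the programmer.
        return "accept" if offered_wage >= minimum_wage else "reject"
    return decide

robot_negotiator = make_negotiator(minimum_wage=500)
print(robot_negotiator(600))  # accept, predetermined by the programmer
print(robot_negotiator(400))  # reject
```

However sophisticated the threshold or rule becomes, the structure is the same: the robot executes a decision that was, in substance, taken by another mind, which is precisely what puts competent understanding in doubt.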

Can algorithms be considered intelligence, or merely sophisticated instructions?

Free will is essential in contracting, and philosophers tend to agree that consciousness is necessary for free will. Eddy Nahmias, in Chapter 3 of "When Do Robots Have Free Will? Exploring the Relationship between Attributions of Consciousness and Free Will", examines whether consciousness causes free will and suggests that free will requires that one's actions properly derive from reasons for action that one has at some point consciously considered, or would at least accept if one considered them. Quoting Randolph Clarke, he advances the view that free will requires a capacity for rational self-determination and that a free agent must be able to exercise this capacity consciously (2003, Libertarian Accounts of Free Will. New York: Oxford University Press). Would robots pass this consciousness test? Regarding freedom of contract, Jean-Paul Sartre (1943) suggests that being conscious, or perhaps self-conscious, necessarily makes one radically free. This suggests that the first-person experience of having open alternatives for future choices is essential for possessing free will.

Do we see a future where robots can meet this test? More importantly, where a meeting of minds (consensus ad idem) is required at the time of contracting, and in light of the highlighted differences between humans and robots, is such a meeting of minds possible, given the requirement of establishing a common understanding of the contract?

In conclusion, with all the above questions, it becomes difficult to tell whether a robot can be classified as a legal person having rights, duties and obligations under the law, separate from its manufacturer or programmer. Consequently, introducing it into the workplace and giving it dominance as an employer or responsibility as an employee, without the certainty of having it incur liability, is no safe venture.