Contribution by Kudzayi Chipidza, a non-executive director on the IITPSA Board of Directors, a member of the IITPSA Social & Ethics Committee, a Professional Member of the IITPSA, and a Cloud Support Engineer at one of the leading international cloud service providers.

Kudzayi Chipidza’s sceptical take on brain-computer interface chips, and how regulation is seemingly having to catch up with trends in technology innovation.

A few weeks ago, after a long day’s work, I sat down on a park bench to relax, to ‘get away from it all’. I made a conscious decision not to think about work…anything but work. Much as I tried, figments of pending tasks kept coming to the fore of my mind: ideas on how to tackle a vexing problem I had encountered earlier in the day, each of them plausible and worth considering as a solution.

An African sacred ibis or Hamerkop, a short distance to my left on the banks of a small pond, kept letting out loud, shrill cries. It was almost as though its cry served as an interrupt to my wandering thoughts, a reminder that I ought to clear out my work thoughts and rest a while. At that moment, I wondered whether I could save all these ponderings, shelve them, put them away somewhere before I forgot them. In an effort to self-regulate, I had deliberately left my phone behind, so I could not even record a voice note.

A while later, as I got up from the park bench, unbeknownst to me, the neurotechnology company Neuralink had just announced it had completed the first implant in its in-human clinical trial. A robot had been used to surgically implant a brain-computer interface (BCI) chip, called ‘the Link’, into a human skull. It gave the recipient the ability to control a computer or phone using thought alone. When rolled out to consumers, the product would be aptly named ‘Telepathy’. Simply by thinking about an idea, a user could jot it down in a file or reminder and save it; they could schedule meetings or reply to emails as they pondered them. Clearly, the possibilities of this innovation are endless.

For now, this innovation is intended to aid people living with disabilities, particularly quadriplegia, the inability to move any of one’s limbs.

On the flipside of this novel invention are its stark ethical implications. Perhaps the most obvious is the mere mention of Elon Musk’s name (the CEO of Neuralink) and brain chips in the same sentence; it immediately sets off alarm bells about data privacy, safety and security. Your thoughts are even more personal than the information stored on your laptop or mobile device. With the advent of BCI technology, the vulnerabilities and inadequacies of existing data privacy and security policies are being extended to an aspect of human life from which we cannot simply disconnect. The ramifications of data loss or malfunction are amplified. This technological advancement means there now exists a physical, tangible and psychological attachment between human beings and connected computing devices.

The troubling fact about private tech companies developing these modern-day disruptive innovations is that their motives are often driven by profit and seldom by benevolence. The thrust is often on becoming the ‘leader’ championing the ‘new’ technology rather than on collaboration, or on devising sustainable guardrails for the wider society that uses it. The world is witnessing this now with cloud service provision, artificial intelligence (AI), quantum computing, edge computing and the Internet of Things (IoT): big tech companies are scrambling to assume the apex position in the market while paying lip service to data privacy, safety, sustainability and responsibility.

Furthermore, the proliferation of BCIs could affect society in ways that are not easy to predict now: unforeseen risks that develop as the technology is widely adopted. Much like social media at its inception, few would have predicted how it would influence general elections in many sovereign nations, or the reported negative behavioural changes it would cause in young people. The IITPSA Code of Ethics emphasises the importance of the public interest in Clause 3.1: the public good should always be an explicit consideration at the forefront of technological development.

In all this, one of the major inconsistencies is that society looks to public entities, such as government task forces or international bodies, to regulate these innovations long after the fact.


IITPSA. (2021). IITPSA Code of Ethics.