A Crash Course On Brain-Computer Interfaces

To some people, the idea of a brain-computer interface may seem alarming, if not scary.

There are a lot of dystopian ideas associated with these devices: that people will be able to control your mind, that A.I. will take over your brain, that you’ll become a cyborg capable of telepathy.

Although brain-computer interfaces don’t currently serve these purposes, maybe one day they will be able to do such futuristic things.

In this article, I’ll give you a brief insight into the purposes of brain-computer interfaces, and how they are being implemented today to change our future.

Before we dive into brain-computer interfaces, we need to briefly understand how the brain communicates.

If someone throws a baseball towards you, you’ll most likely sprint towards it and catch the ball. If you’re really extra, you might even dive and land on your hands and knees to make sure you catch it.

You’re upstairs in your room when your father calls your name and asks you to come downstairs to do a task most of us dread: the laundry. You rush down before he gets annoyed and begins to lecture you about punctuality.

So, how do we know how to do these tasks?

Essentially, our brain is the master organ of our body, and it is made up of billions of cells called neurons.

These neurons carry information in the form of electrical pulses, and whenever you do anything, neurons in your brain fire. A neuron has tree-like branches called dendrites, which receive signals, and a longer projection that looks like a tree trunk, called the axon, which sends signals.

These electrical pulses can pass from neuron to neuron with the help of chemical signals called neurotransmitters.

At the end of the axon, we find synapses, which act as channels for neurons to communicate. Neurotransmitters cross the synapse to reach the next neuron, triggering new electrical pulses as they travel.


Within brain-computer interfaces, electrodes are strategically placed to measure and monitor brain communication and to perform tasks based on certain criteria.

So, what is a brain-computer interface?

Essentially a brain-computer interface (otherwise known as a BCI) is a system that allows communication between the brain and machines.

To put things simply, they work in three main steps:

  1. Collect the brain signals
  2. Interpret the brain signals
  3. Output commands to a machine according to the brain signals that were monitored
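The three steps above can be sketched, very loosely, in code. This is a hypothetical toy example, not a real BCI: a simulated one-channel signal stands in for an actual headset, and the threshold value is made up.

```python
import random

def collect_signal(n_samples=100):
    """Step 1: collect brain signals (here: simulated voltages in microvolts)."""
    return [random.gauss(0, 10) for _ in range(n_samples)]

def interpret_signal(samples):
    """Step 2: interpret the signals (here: average absolute amplitude)."""
    return sum(abs(s) for s in samples) / len(samples)

def output_command(amplitude, threshold=8.0):
    """Step 3: map the interpretation to a machine command (threshold is made up)."""
    return "MOVE" if amplitude > threshold else "REST"

signal = collect_signal()
command = output_command(interpret_signal(signal))
print(command)  # either "MOVE" or "REST"
```

A real BCI replaces each step with something far more sophisticated (amplifiers, filtering, and machine-learning decoders), but the collect-interpret-output loop stays the same.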

BCIs can be differentiated into three separate categories:

  1. Non-invasive
  2. Semi-invasive
  3. Invasive

This is what a non-invasive BCI looks like:

The electrodes are placed on top of the scalp, where they measure the electrical potentials (or, in some systems, the magnetic fields) produced by the brain.

Non-invasive BCIs are able to read brain signals without the use of surgery and have the ability to control the user’s external environment.

Non-invasive BCIs are already being used for gaming, controlling robotics, and even prosthetics.
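One way non-invasive systems interpret scalp recordings is by measuring how strong a particular brain rhythm is, such as the roughly 10 Hz "alpha" rhythm. The sketch below is a simplified illustration: the sampling rate, frequency, and clean noise-free signal are all assumptions made for the example.

```python
import math

FS = 250          # samples per second (assumed)
FREQ = 10         # alpha-band frequency of interest, in Hz
DURATION = 1.0    # seconds of signal

# Simulate one second of "EEG": a clean 10 Hz wave with 5-microvolt amplitude
# (real recordings are much noisier; this keeps the example simple).
t = [i / FS for i in range(int(FS * DURATION))]
signal = [5.0 * math.sin(2 * math.pi * FREQ * ti) for ti in t]

def band_amplitude(samples, freq, fs):
    """Correlate the signal with a sine and cosine at `freq` (one DFT bin)."""
    n = len(samples)
    re = sum(s * math.cos(2 * math.pi * freq * i / fs) for i, s in enumerate(samples))
    im = sum(s * math.sin(2 * math.pi * freq * i / fs) for i, s in enumerate(samples))
    return 2 * math.sqrt(re * re + im * im) / n  # amplitude of that rhythm

print(round(band_amplitude(signal, FREQ, FS), 1))  # prints 5.0
```

A game or robot controller could then map "alpha power above some threshold" to an action, which is the same idea many consumer EEG headsets build on.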

A semi-invasive BCI looks like this:

The electrodes are placed on the exposed surface of the brain.

This practice is referred to as electrocorticography (ECoG), the direct recording of electrical potentials from the cerebral cortex of the brain. The ECoG signal is captured by electrodes placed either above the dura mater (epidural) or beneath it, resting on the arachnoid membrane (subdural).

Invasive BCIs are implanted deep into the brain during neurosurgery procedures.

The devices can consist of single units that detect signals from one specific area of brain cells, or multi-units that detect signals from multiple areas of brain cells.

Now that we’ve explored the three main types of BCIs, let’s take a look at the real-life purposes of these intricate devices.

A big use case of BCIs, and a field that will be a huge focus in the future of these interfaces, is restoring impaired functions of the human body.

Let’s remember that BCIs recognize the intent of an individual through brain signals, then decode the neural activity, and translate it into an output to accomplish a specific goal.

As a result of this process, BCIs are being developed to control prosthetics using non-invasive EEGs.

At the University of Houston, researchers were able to create an algorithm allowing an individual to grasp a bottle with a prosthetic controlled by a brain-computer interface.

Companies such as NeuroPace and Neuralink are using invasive brain-computer interfaces to target spinal and brain issues that many people face on a daily basis.

Neuralink has been developing the first stages of a surgically implanted brain chip that measures our brain activity.

The chip, referred to as the Link, is being developed to treat medical conditions such as paralysis, mental illness, epilepsy, and memory loss, and even to increase hearing ranges. Currently, the chip has been shown to detect brain activity and function in pigs, and Neuralink is planning to improve the device so it can be put on the market for consumers to purchase.

The future stages of the chip plan to accomplish futuristic tasks such as being able to download skills into your brain, stream music in your head, and even telepathy.

If you are interested in reading more about Neuralink, check out this article I wrote:

The mission of NeuroPace is to target epilepsy in individuals. Essentially their device continuously monitors electrical brain activity and delivers stimulation pulses when the device predicts patterns correlated to seizure activity. With this amazing device, it’s possible to stop seizures before patients know that they are even having one.

Measuring and monitoring brain activity will allow the device to send pulses to the brain to stop seizures.
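The closed-loop idea behind a responsive device like this can be sketched in a few lines. To be clear, this is a made-up illustration of "monitor, detect, stimulate", not NeuroPace's actual algorithm: the threshold, window size, and readings are all invented for the example.

```python
SEIZURE_THRESHOLD = 50.0  # assumed amplitude limit, in microvolts (made up)
WINDOW = 4                # number of recent samples to average (made up)

def monitor(stream):
    """Yield a stimulation decision (True/False) for each incoming sample."""
    recent = []
    for sample in stream:
        recent.append(abs(sample))
        if len(recent) > WINDOW:
            recent.pop(0)
        # Stimulate when the running average of recent activity looks seizure-like.
        yield sum(recent) / len(recent) > SEIZURE_THRESHOLD

# Simulated readings: quiet activity, a seizure-like burst, then quiet again.
readings = [5, 8, 6, 7, 90, 95, 110, 100, 9, 6]
decisions = list(monitor(readings))
print(decisions.count(True))  # prints 4
```

The real device uses far more sophisticated pattern detection, but the loop is the same: watch the signal continuously, and fire a stimulation pulse the moment activity matches a learned seizure signature.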

Brain-computer interfaces are the future of humanity. Currently, they are being developed to do amazing things in medicine and entertainment. With more research, time, and development, the futuristic things we always hear about, such as telepathy, mind control, and downloading information into our heads, may soon be possible to experience with these interfaces.

Contact me for any inquiries 🚀

Hi, I’m Ashley, a 16-year-old coding nerd and A.I. enthusiast!

I hope you enjoyed reading my article, and if you did, feel free to check out some of my other pieces on Medium :)

Articles you will like if you read this one:

💫 How I Made A.I. To Detect Rotten Produce Using a CNN

💫 Detecting Pneumonia Using CNNs In TensorFlow

If you have any questions, would like to learn more about me, or want resources for anything A.I. or programming related, you can contact me by:

💫Email: ashleycinquires@gmail.com

💫 Linkedin

