BCI: Facts, Fiction, and Tom Cruise
The notion of brain-computer interfaces is well-worn in literature and cinema. What happens when life imitates art?
The idea of a brain-computer interface (BCI) is well-worn in literature and cinema. Dumbledore’s Pensieve, and Harry’s subsequent access to it, is a mechanism by which, with perfect transcription fluency, a memory in one brain flows elegantly into another. Rowling coins the term “legilimency,” playing with Latin and wizarding semantics, to describe the process of reading minds. In the process, she also broaches the topic of consent – Harry’s frenemy Snape is concerned that Voldemort might attempt to access Harry’s thoughts and assists Harry in building what amounts to a cognitive firewall.
In an entirely different genre, Tom Cruise’s role as David Aames in “Vanilla Sky” contains an example of a BCI: the protagonist is placed in a lucid dream state1 while his body is cryogenically frozen. Almost twenty years earlier, Christopher Walken starred in “Brainstorm,” a film in which a system records the sensory and emotional experience of an individual, allowing those cognitive events to be experienced by another human being. The military gets involved, and again, the ethics of proper use are discussed and depicted.2 Putting aside philosophical discussions of consciousness and free will, these are other examples of brain-computer interfaces.
Our lives are increasingly enmeshed and entangled with technology. Don’t believe me? I’ll bet you can remember your childhood phone number with minimal effort. Now try to remember any phone number you’ve learned in the past five years. I’ll wait. The tools we employ to convey our intention to that technology (keyboards, mice, smartphone touch screens, Siri, etc.) are as fundamental to our lives as shirts and shoes.3 What if our thoughts were sufficient to communicate intention to a machine and receive a response? That technology currently resides on the boundary separating science fiction from science fact.4
Perhaps this technology would function even more effectively for human beings with hands. The proportion of the motor cortex responsible for the motion of a body part shrinks (but does not vanish) if that body part is missing or removed. Even so, it is among able-bodied computer users that BCI technology stands to create almost unprecedented economic value. Typing is an inefficient action. The QWERTY keyboard, though not actually designed to slow down typists as is often assumed, was constructed to avoid typewriter jams.5 Pointing and clicking with a mouse is an inefficient action. If you don’t believe me, imagine using a trackpad to move the fork back and forth between your plate and your mouth. It’d be the best weight-loss program ever devised. Locating a button on a monitor, reading the fonts, analyzing color patterns, and navigating drop-downs, all while being pummeled by pop-ups and auditory alerts, is assuredly inefficient. What if your thoughts (and the neurological data they generated) were sufficient to eliminate the keyboard, the mouse, the monitor, and the inefficiency?
Let’s try a little back-of-the-envelope math:
There are approximately 8 billion humans walking the planet.
Of those, perhaps half (~4 billion) are of working age.
Of those, perhaps half (~2 billion) are sitting at a computer for a living, staring at a smartphone, or otherwise attached to their technology (or will be by 2030).6
Let’s assume that they generate $20/hr in economic value.7
Now imagine that scrapping the inefficient devices returned 30 minutes (0.5 hrs) of productivity to each workday.8
Let’s do some multiplication:
2 billion humans × $20/hr × 0.5 hrs/day × 250 workdays/yr ≈ $5 trillion/yr
That’s roughly the GDP of Japan, which happens to be the world’s third-largest economy.
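If you’d like to check the arithmetic yourself, here is the same estimate as a few lines of Python; every input is one of the rough assumptions above, not a measured figure:

```python
# Back-of-the-envelope estimate using the assumptions above (all rough guesses).
workers = 2_000_000_000          # ~2 billion technology-attached workers by 2030
value_per_hour = 20              # assumed economic value generated, $/hr
hours_reclaimed_per_day = 0.5    # half an hour returned to each workday
workdays_per_year = 250

annual_value = workers * value_per_hour * hours_reclaimed_per_day * workdays_per_year
print(f"${annual_value / 1e12:.1f} trillion per year")  # -> $5.0 trillion per year
```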
Technology that offers that type of economic potential will be developed. What if, during your next meeting, you could simply make a note for yourself to pick up that mango chutney you like from Trader Joe’s without the distraction of surreptitiously pulling up some app on your laptop, then jotting down the relevant thought, then attempting to refocus upon the meeting’s speaker, then trying to recall where you wrote that note down, then feeling guilty about missing some of the content of the meeting while some sector of your brain was trying to plan dinner? What if your next Google search was as simple as thinking about the question to which an answer was required? What human potential would that unlock?
But this poses significant ethical questions beyond the obvious technical challenges. Will the corporate behemoths who interrupt your web surfing with advertisements now interrupt your thoughts? Will the state have access to your prurient impulses and politically incorrect opinions? Will your spouse? When those trillions of dollars of economic productivity emerge, who will realize the gains? Will you complete your professional responsibilities in 6 hours rather than 8? Doubtful (see: pagers, BlackBerrys, emails, shared file systems, etc.). Will you receive greater compensation as a result of your increased productivity? Possibly (depending on whether you have any ownership interest in your employer or other tech-enabled entities). Will you have the ability to utilize the technology for your own interests while avoiding some big-brother-esque employer “legilimency”?
If developed responsibly, thoughtfully, and with principles of autonomy and agency at its core, a brighter future awaits. In this future, the paralyzed regain functionality, people with neurological disorders receive compassionate, brain-specific care, those suffering from depression are treated more effectively, and the digital worker’s productivity grows exponentially. And everyone retains their dignity, privacy, and sovereignty.
1 He also hooks up with Penelope Cruz and Cameron Diaz whilst in this state. I suppose that if humans obtain the capacity to simulate events that are ultimately experienced as real, the phrase “sex sells” will assume an entirely new meaning.
2 Torture, brainwashing, psychotic experiences from accidental viewing of traumatic episodes, and other common elements in dramatic representations of the military-industrial complex.
3 Actually, given the work-from-home shift in recent years and the recent holidays, it seems the sartorial requirements for modern employment are far more lenient than those on the technological side.
4 A paralyzed human thinks about the hand motions that would have written letters. An algorithm decodes the intended letters from motor cortex activity via a recurrent neural network. Another algorithm can then autocomplete words and sentences, proofread to smooth the inevitable errors, and so on. There are already published manuscripts addressing this topic.
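For the technically curious, here is a minimal sketch of the shape such a decoding pipeline might take, written in PyTorch; the channel count, alphabet size, and architecture are invented for illustration and are not taken from any published decoder:

```python
# Purely illustrative sketch of a "neural activity -> characters" decoder.
# All dimensions below are hypothetical, not drawn from the published work.
import torch
import torch.nn as nn

N_CHANNELS = 192   # hypothetical number of recorded motor-cortex channels
N_CHARS = 31       # hypothetical alphabet: letters plus space and punctuation

class CharacterDecoder(nn.Module):
    """Recurrent network mapping binned neural activity to character logits."""
    def __init__(self, hidden_size: int = 256):
        super().__init__()
        self.rnn = nn.GRU(N_CHANNELS, hidden_size, num_layers=2, batch_first=True)
        self.readout = nn.Linear(hidden_size, N_CHARS)

    def forward(self, neural_activity: torch.Tensor) -> torch.Tensor:
        # neural_activity: (batch, time_bins, channels) of spike counts / features
        hidden, _ = self.rnn(neural_activity)
        return self.readout(hidden)  # per-time-bin character logits

decoder = CharacterDecoder()
fake_activity = torch.randn(1, 500, N_CHANNELS)  # one simulated trial, 500 time bins
char_logits = decoder(fake_activity)             # shape: (1, 500, N_CHARS)
# A separate language model would then autocomplete and proofread the decoded text.
```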
6 Fewer than 50% of all employed persons may be engaged in computer-based work, considering the number of agrarians, laborers, service professionals, etc. However, there are also quite a few humans, both students and retired folks, who are attached to technology throughout the day, so I’ll stand by the rough heuristic.
7 This is stunningly conservative, given that full-time employment is roughly 2,000 hrs/year, which would then imply $40,000 of economic productivity, and of course, if a knowledge worker is paid $100K+, they had better be worth considerably more to their employer. Of course, not everyone in the Western world is a knowledge worker, not everyone’s employer deploys their aptitudes efficiently, etc. However, if you assume individual economic productivity is distributed according to a Pareto distribution (individual wealth certainly is), then even if most digital employees produce less than $20/hr (and I might dispute that), the mean is much higher, since Jeff Bezos and Warren Buffett count too. Moreover, if you consider the GDP per capita of a developed or developing nation, assume roughly 2,000 hours of work per worker per year, and assume half the nation is in the labor force, you arrive at a figure somewhere around $20-$40/hr (as of 2010, and those figures have increased since). Also, the digital labor force is most likely producing more per hour than the non-digital labor force.
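To make that last line of reasoning concrete, here is the implied hourly figure computed from hypothetical national-accounts numbers (the GDP per capita below is an illustrative assumption, not a citation):

```python
# Implied hourly productivity from national accounts (illustrative numbers only).
gdp_per_capita = 40_000            # hypothetical developed-economy GDP per capita, USD
labor_force_share = 0.5            # assume half the population is in the labor force
hours_per_worker_per_year = 2_000

gdp_per_worker = gdp_per_capita / labor_force_share            # $80,000 per worker
value_per_hour = gdp_per_worker / hours_per_worker_per_year    # $40 per hour
print(f"~${value_per_hour:.0f}/hr of output per worker")
```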
8 If you thought the last one was conservative…sheesh. We can imagine doubling productivity with better capture of ideas and thoughts, but we can also imagine 10x or 100x productivity increases, at which point the economic figures discussed above are off the charts!