MadSci Network: Computer Science
Dear Matt,

This is my first time answering a question here, so allow me to answer yours in parts. I assume by "language" you mean spoken languages such as English or German. You want to know:

a) Could a chip, if implanted directly in the brain, understand a person's thoughts in terms of sentences or "language" from his brain waves?

Ans: There is a technology called Brain-Computer Interfaces (BCIs) that allows computers to convert a person's brain-wave patterns into meaningful interpretations, such as "imagining moving my left hand" (motor imagery), or to record involuntary responses. [1] But this technology is very far from being able to interpret "thoughts" as sentences or language. For example, the best you might get is that a person is hungry; getting "I think I would like a sandwich right now" is still quite far away.

b) Could such a chip be built using nanotechnology?

Ans: I'm not aware of any successful attempts so far for this specific purpose, though the prospects are promising. [2] Achieving it faces two barriers. The first is building chips using nanotechnology at all (any kind of chips, not just neural implants). We're getting there fast, but we're not there just yet. Once we know how to reliably and effectively manufacture such chips on a large scale, making them talk to the brain is the next problem. Today a lot of research is going into getting current silicon chips to talk to brain matter directly (so-called invasive BCIs). [3] [5] The problem is close to being solved, but because of the risks involved in simply putting something inside a person's brain, researchers are taking it slow, experimenting with rats and with neural cultures in petri dishes; the devices will need to go through intensive clinical trials first. Neurosurgeons today can already connect EEG sensors directly to a person's brain matter during surgery, [6] so we know for sure this can be done.
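To give a flavor of what "interpreting motor imagery" means in practice, here is a deliberately simplified sketch. Real BCI systems use far more sophisticated signal processing and machine learning; this toy version only illustrates the textbook idea that imagining a hand movement suppresses the mu rhythm (roughly 8-12 Hz) over the opposite motor cortex. The channel names (C3, C4), sampling rate, and thresholding rule here are illustrative assumptions, not a real system's design.

```python
import numpy as np


def band_power(signal, fs, lo, hi):
    """Energy of `signal` in the [lo, hi] Hz band, via a simple periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].sum()


def classify_motor_imagery(c3, c4, fs=250.0):
    """Toy motor-imagery classifier.

    c3: EEG samples from over the LEFT motor cortex (controls the right hand)
    c4: EEG samples from over the RIGHT motor cortex (controls the left hand)

    Heuristic: imagined movement of a hand suppresses mu-band (8-12 Hz)
    power over the opposite hemisphere (event-related desynchronization),
    so whichever hemisphere shows WEAKER mu power points at the imagined hand.
    """
    mu_left_hemisphere = band_power(c3, fs, 8, 12)
    mu_right_hemisphere = band_power(c4, fs, 8, 12)
    # Suppressed mu on the right hemisphere -> left-hand imagery, and vice versa.
    return "left hand" if mu_right_hemisphere < mu_left_hemisphere else "right hand"
```

Even this crude comparison hints at why BCIs can distinguish a handful of coarse mental states, yet are nowhere near recovering full sentences: the signal carries a few bits of lateralized band power, not words.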
We just need to get it right before this can be allowed without the constant supervision of a highly competent neurosurgeon.

c) Would language translation be possible and effective?

Ans: Thankfully, automated language translation (machine translation) is good enough today. SYSTRAN, the leading backend for most of the well-known online translation services such as Babel Fish, is quite capable: a reader can usually understand what the original sentence was trying to convey from the translated one. Don't expect it to provide a perfect translation, though, or to preserve certain properties of spoken speech such as implied sarcasm, humor, etc. [4] Naturally, the challenge of fitting such a translation system into an extremely resource-constrained chip (in processing power, memory capacity, and battery power) would still need to be figured out.

References:
1. http://www.slate.com/id/2113353/
2. http://www.britannica.com/eb/article-236444/nanotechnology
3. http://www.ninds.nih.gov/funding/research/npp/index.htm
4. http://www.systransoft.com/
5. S. P. Levine, J. E. Huggins, S. L. BeMent, R. K. Kushwaha, L. A. Schuh, M. M. Rohde, E. A. Passaro, D. A. Ross, K. V. Elisevich, and B. J. Smith, "A direct brain interface based on event-related potentials," IEEE Trans. Rehabil. Eng., vol. 8, pp. 180-185, 2000.
6. http://cms.clevelandclinic.org/neuroscience/body.cfm?id=977
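As a footnote to part c: the gap between "conveys the gist" and "perfect translation" is easy to see even in a deliberately naive word-for-word sketch. The mini-lexicon below is a made-up illustration, nothing like SYSTRAN's actual rule-based pipeline, but it shows how literal translation can be understandable while still losing idiom (and, by extension, sarcasm or humor).

```python
# Hypothetical German->English mini-lexicon for illustration only.
# Real systems use rich dictionaries plus grammatical analysis.
LEXICON = {
    "ich": "I",
    "habe": "have",
    "hunger": "hunger",
    "jetzt": "now",
}


def gloss(sentence):
    """Word-for-word gloss: understandable, but not idiomatic English.
    Unknown words are passed through in brackets."""
    return " ".join(LEXICON.get(word.lower(), f"[{word}]")
                    for word in sentence.split())


print(gloss("Ich habe Hunger"))  # -> "I have hunger" (gist of "I'm hungry")
```

"I have hunger" gets the meaning across, which is the level of usefulness described above, while an idiomatic English speaker would say "I'm hungry" -- exactly the kind of nuance a literal system drops.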
Try the links in the MadSci Library for more information on Computer Science.