Hacker News | EmlynC's comments

There are many open-source solutions for BCI projects like BCI2000 (http://www.schalklab.org/research/bci2000), OpenVibe (http://openvibe.inria.fr/) and EEGLab (https://sccn.ucsd.edu/eeglab/index.php). That's based on the kinds of tools our customers use. Most of these aren't as pretty as a tool like Neurovis, but in reality most of the information we use to control prosthetics or signal intention involves looking at the temporal-frequency relationships between and within broad regions of the brain. There isn't a lot to gain for BCI from just watching the brain light up like this; the main use for a visualisation like this is, as the docs say, the diagnosis and determination of epilepsy, since in epilepsy the activity you'd see is much higher than usual.
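To make "temporal-frequency relationships" concrete, here is a minimal Python sketch of the kind of band-power feature extraction a motor-imagery BCI might rely on. The function name, band edges and synthetic signal are my own illustration, not taken from any of the toolkits above:

```python
import numpy as np

def band_power(signal, fs, lo, hi):
    """Average spectral power of a 1-D signal in the [lo, hi] Hz band."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return spectrum[mask].mean()

np.random.seed(0)
fs = 256                          # sampling rate in Hz
t = np.arange(0, 2, 1 / fs)       # two seconds of synthetic "EEG"
eeg = np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.randn(len(t))

mu = band_power(eeg, fs, 8, 12)     # mu band, modulated by motor imagery
beta = band_power(eeg, fs, 18, 26)  # beta band, for comparison
print(mu > beta)  # True: the 10 Hz component dominates
```

A real pipeline would track how these band powers change over time and across electrode regions, then feed them to a classifier; the spatial pattern of the raw amplitudes alone carries much less of the usable signal.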

EEG has better temporal resolution than fMRI: you are measuring electrical activity rather than vascular changes, and the former changes far more rapidly than the latter. EEG, however, only captures the surface activity of the brain, so you don't get information about physically 'deeper' brain processes; this is where fMRI is invaluable. EEG is also limited by the size of the electrodes and how many you can physically place in one location; 256 electrodes on an EEG cap is about the practical limit.

Electrocorticography (ECoG) involves implanting electrodes on the dura. This can substantially improve the density of electrodes in a given area, but it still doesn't measure deep brain activity, and we have no way of leaving the electrode grid in place for long periods without risking infection. For BCI, we've been able to classify more classes of data using ECoG than EEG; research by Kyousuke Kamada and Gerwin Schalk is informative here. It's a very promising area if we can work out how to implant the electrodes, seal the skull and telemeter data out.

Magnetoencephalography (MEG) can help with measuring deep brain activity, but there are other tradeoffs to consider. Essentially, the point where we are now is combining multiple techniques to get the best temporal, spatial and frequential compromises.

Thus, in answer to your question: no, fMRI is not great for realtime responses. Measuring electrical activity has better temporal properties, so EEG and ECoG work better here. Sensor density is part of the problem, but once you solve density you then need to consider how you'll deal with deep brain measurements.

Background: I own a company that distributes BCI equipment for g.tec medical engineering in the UK. We've been operating in this space for 8 years. I have a PhD in Pharmacology and a speciality in electrophysiology.


Thank you so much for taking the time to answer my questions. Looks like I have some reading up to do.

The reason I ask is that my long-term thinking is that while motion-controlled virtual reality is fun and going to be good for exercise games and training, I think the human brain-computer interface represents the most promising of the interaction methods (nothing says you can't be hooked up and still typing, using a mouse, or motion-controlling either).

You mentioned the medical desire to see deeper brain waves, but I just want to control a computer. Given the current state of EEGs, are they useful yet as a practical operating-system control input?


What gets me is why we don't see more viruses that _deliver_ the patch to fix the vulnerability.

It's perhaps a little more difficult, as you'd need a vulnerability to keep spreading the inoculation. Arguably, though, you release the virus, let it spread, and then trigger the inoculation using a mechanism like calling out to a webserver, just as the kill switch worked here.
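A sketch of that trigger mechanism, inverted from the WannaCry kill switch (there the malware halted once its domain resolved; here a hypothetical inoculating worm would wait for a domain to start resolving before delivering its patch). The domain and function names below are made up for illustration:

```python
import socket

# Hypothetical trigger domain; the .invalid TLD never resolves in real DNS.
TRIGGER_DOMAIN = "pull-the-trigger.invalid"

def should_deliver_patch(domain=TRIGGER_DOMAIN, resolve=socket.gethostbyname):
    """Return True once the operator registers the trigger domain."""
    try:
        resolve(domain)        # DNS lookup succeeds only after registration
        return True            # switch from spreading to patching
    except OSError:            # socket.gaierror is a subclass of OSError
        return False           # not yet registered; keep spreading quietly
```

Until the operator registers the domain, every copy keeps propagating; once it resolves, each copy flips to delivering the patch on its next check, with no command channel to trace back to the operator beyond a single DNS registration.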


You run the risk of jail time without the upside of ransom payments.


True. Plus, I forget the exact legislation, but you are effectively breaking into the computer first, which is a crime. Committing a crime for a noble outcome is still a crime.

Incentives are a real issue here, and those that provide the patch would reasonably expect a reward: MS for updates, an AV provider for testing, finding and securing the vulnerability, and a whitehat for disclosure. However, there is no reason why a "charitable" hacking group wouldn't do this as part of some sort of digital vigilantism. Sometimes people do things without extrinsic reward, and the thrill here is that it is as hard as cracking, but you get to know that your efforts could be immediately applied.


That's an interesting idea: release a virus to fight a virus. Reminds me of the game Uplink, where [spoiler alert] you choose either to help spread a virus to destroy the Internet, or to help spread a "counter-virus", hacking large servers to cure them before they're overrun. Digital vigilantism, that's what that is.


"Digital vigilantism" that's exactly it.


While it doesn't have an inherent advantage, it has the mindshare and momentum of a community that has these tools now.

R could be just as capable as Python, but I think Python has largely won the race to be the most popular language for data analysis, which in turn encourages more developers to commit to it, cementing Python's advantage.

R still has a solid lead in statistics and good mindshare amongst academics.


> R could be just as capable as Python, but I think Python has largely won the race to be the most popular language for data analysis which in turn encourage more developers to commit to it, cementing Python's advantage.

You're comparing apples and oranges. R is a domain-specific language and will never be a general-purpose language.

It is not true that Python won any race in statistics. http://www.kdnuggets.com/2015/05/r-vs-python-data-science.ht...

That's before even considering the industry investment coming from Microsoft and other major players.

R is ahead of Python in statistics, in both momentum and numbers. Python is a good choice, but it is still playing catch-up to R given the speed at which R is developing. With data.table, the Hadleyverse (https://www.r-bloggers.com/welcome-to-the-hadleyverse/) and RStudio, the momentum has clearly been on the side of R.

Just 5 years ago R had a fraction of the users it has today.

Python and R are both good choices with comparable speed, but the difference is that R is a domain-specific language with a very positive ecosystem.


R is a LISP. I would disagree heavily with it being domain-specific. It is as capable and Turing complete as any language. The only argument you can create is about performance and the judiciousness of putting stats functions in the base library, as opposed to Common Lisp which ships with even less. Not only "will" it be a general purpose programming language, it already is.


This is a really solid response.

I'm a pharmacologist and I've worked in a number of biotechs. I have been part of pricing discussions to value our drugs, and your assessment is correct: price discrimination, as in most industries, is based on what a particular market can bear. Factors that influence this are the way healthcare systems are run (i.e. public vs insurance-based), the cost of compliance, and competition. I wouldn't say that the industry subsidises poor countries via sales in the developed world; it's more that it's better to be paid something than nothing. I do, personally, like the happy side-effect this has for patients in poor countries.

Increased regulation costs more to comply with, and that cost is of course borne by the consumer. The cost of regulatory compliance also varies with the market you sell to, the US being one of the most expensive. Just as with software or hardware, the price reflects the cost of producing and maintaining the product.

A notable difference between biologics (insulin, any hormone, ...) and small chemical entities (paracetamol, aspirin, etc.) is that biologics are broadly harder to keep in their stable, active form. They are heat-sensitive, chemically sensitive and have a tendency to stick to themselves. This tends to increase the cost of storage, logistics and compliance. It also means that if you can make a worthwhile biologic, it will generally face less competition.


>happy side-effect this has for patients in poor countries

Sometimes I wonder if some HN comments are being posted from an alternate universe...


Ian is linking from twitter for the images so your company is probably blocking twitter. Evidently, however, not the equally large time-sink that is Hacker News!


Disclaimer: I'm one of the organisers

I thought the vibe of the conference was the best we could have hoped for.

Python is crazy popular now: the number of people we had from big blue chips, and the fact that Sainsbury's (a nationwide grocer, for you US chaps) sponsored us, was telling of just how deeply Python has penetrated the enterprise market.


Animal experimentation is repellent to many people, and I can see why. The point is that society has decided that it would prefer well-researched, empirically observed, working medicines.

Regulation must strike a compromise: allow research to continue, with enough oversight to limit abuse. It is the sort of compromise developers make with security; enough to limit abuse, but not so much that no-one can use the system!


Your point is quite right; I don't think it's self-serving at all. Complex physiological diseases like cancer can be observed as tumours grafted to a scaffold in vitro, but you cannot observe the effect of a drug, and its relationship to the whole organism, without using a whole organism.

Monkeys have immune responses that are as similar to humans' as we can hope for. Mice, while they have some conservation of immune-related genes, show much divergent expression of immune-related genes, including alterations in master-gene (cis) and innate immune gene (CD4) expression [1], which may explain why results in mice don't translate very well to humans. We have an awful time trying to get decent responses to allergic stimuli with mice and guinea pigs! Rodents have very robust immune systems, in my experience.

[1] http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3581886/


Unfortunately, science isn't the sort of endeavour where anything can resolutely be considered necessary to start with. Arguably, there should be some doubt about whether your research will lead to something important; otherwise you can't be doing anything novel. There is nonsense in any endeavour, and that is why there are layers of ethics committees, reviews and audits within animal research that limit wanton nonsense. The price of any endeavour is that there will be waste; humans cannot help it. It is unfortunate that here the waste is at the expense of animal life.

The first article you link to is about the abuse of frequentist statistics; it is not a piece of nonsense primary research (1182327). The article is correct and an interesting read, but it does not support the claim of poor use of animal experiments. The second article (8124111) is about clinical research, not pre-clinical animal research, although the arguments it raises are applicable to pre-clinical research. It is very difficult to find and expose nonsense research!


I can confirm that this largely rings true for the UK. The Home Office is the body that certifies each specified procedure and provides oversight so that animal experimentation is done in the most humane manner practical for a given experiment.

I recently completed my doctorate in Pharmacology and I have a Home Office license to perform specific surgical procedures on animals. I have training in surgery, anaesthesia and euthanasia which is routinely assessed to maintain my Home Office license.

The details of the experimental protocol that the poster lists - deprive a resource, validate behaviour, and reward - are probably correct; sensationalised, but correct. I say that because this reads like the sort of pain protocols I have been involved with in the past. The simple fact is that to explore whether a pain medication works, you must inflict pain, see whether your drug modifies pain behaviour, and repeat to sufficient statistical power.
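The "sufficient statistical power" step is where the animal numbers come from; ethics committees typically ask for exactly this calculation to justify group sizes. A rough Python sketch of the standard normal-approximation sample-size formula for a two-group comparison (the effect size and variability figures below are illustrative, not from any real protocol):

```python
from math import ceil
from statistics import NormalDist

def animals_per_group(delta, sigma, alpha=0.05, power=0.80):
    """Normal-approximation sample size for a two-group comparison:
    animals per group needed to detect a mean difference `delta`
    when the outcome has standard deviation `sigma`."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided test
    z_beta = NormalDist().inv_cdf(power)
    return ceil(2 * ((z_alpha + z_beta) * sigma / delta) ** 2)

# e.g. detecting a 30% change in withdrawal latency when the
# measurement's standard deviation is 25% of baseline:
print(animals_per_group(delta=0.30, sigma=0.25))  # 11 per group
```

Note how the count scales with the inverse square of the effect size: halving the detectable difference roughly quadruples the animals required, which is why subtle drug effects demand the large cohorts that outsiders find alarming.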

The detail of the experimenter dispatching (killing, euthanising, ...) the animals with chloroform, en masse, in a plastic bag describes a violation of his license and his colleagues' licenses; it is not a listed method under Schedule 1 (Methods of Euthanasia). This is one of the facets of animal experimentation regulation that I am proud of: the humane killing of animals ensures minimal distress. The euthanasia regulations for animal experimentation are stricter than those in the food industry, and far stricter than the exemptions that religious faiths have when preparing meat.

Non-vertebrate animals are covered by the ASPA 1986 guidelines for euthanasia in the UK, and commonly you must present an anaesthetic regime for all uses of laboratory animals. Octopuses are explicitly named, as they are among the oldest laboratory animals in use; they were pivotal in understanding the role of ions, ion channels and pumps in nervous transmission.

