
Neuralink, mind control and the law

 
Anonymous Coward
User ID: 83443725
United States
11/15/2022 03:50 PM
It's a long talk. I posted the transcript to go along with it. It gets into moral security and brain hacking toward the latter half.




Welcome to this online interview on Neuralink, mind control and the law. On the weekend, Elon Musk provided a live demonstration of Neuralink's technology, using pigs with surgically implanted brain-monitoring devices. But what does this mean from a legal and ethical perspective? What are the implications of this type of technology, which interfaces between the human brain and computer devices? To find out, the Australian Society for Computers and the Law invited Dr Michelle Sharpe and Dr Allan McCay to explore these issues and to call for urgent action in these uncharted waters. Now, over to Michelle and Allan.

Michelle Sharpe: Neuralink. What are some of the therapeutic aims of this kind of technology?

Allan McCay: The immediate aims that Elon Musk announced at the weekend, Sydney time, were to restore capacities to people who may have something like a spinal cord injury, or difficulty in communicating because of some sort of health condition. But there was also a bit of peering into the future, at other possible uses for this technology. I could say something about that.

Michelle: It's interesting to start by exploring some of what's already considered to be among the therapeutic uses: Alzheimer's, depression, epilepsy, that kind of thing. But there does immediately seem to be a sort of sci-fi, Black Mirror speculation about what this could allow for. What are some of the uses, or abuses, that you think this technology might allow for?
Allan: To talk about neurotechnology generally, probably including Neuralink: once you can interact brain-to-computer, you can do things like control a cursor, or control some sort of device like a wheelchair, or perhaps control a car or a drone. And once you can do those things, presumably you could also commit offences with that kind of control. That side is about using your neural activity to control things. Another aspect is stimulating the brain in order to address things like depression, as you mentioned, and that raises the possibility of over-stimulation or under-stimulation, perhaps generating things like impulsivity, or something that might lead to trouble. So there are great things possible from neurotechnologies, including brain-computer interfaces, but they come with problems.

Michelle: And one of the problems you've identified is over-stimulation or under-stimulation of the brain. So we would have, what, the capacity to alter ourselves, to alter our personalities?

Allan: I think the idea is that brain stimulation could address various aspects of a person's mental condition that are undesirable to them, and provide a means of alteration other than something like cognitive behavioural therapy or medication. And those kinds of alterations might go wrong, or have unintended side effects. I don't want to appear too downbeat about it, though, because I do think there are really wonderful possibilities here. The possibility of restoring capacities to a person with locked-in syndrome, who can't move, is just wonderful.

Michelle: And things like Alzheimer's, or people suffering from severe depression, which can be debilitating: the therapeutic benefits to them are immense. But talking about how this device allows us to control impulses, what does this mean for crime and for offenders? Say, for example, violent offenders who have strong, aggressive, violent impulses. What does this technology mean for them, and what does it mean in the hands of the state?
Allan: We've got a somewhat sketchy picture of what might be possible from Elon Musk's company, so maybe I'll talk more broadly about what neurotechnology might make possible. Along with Dr Nicole Vincent from UTS and Associate Professor Thomas Nadelhoffer of the College of Charleston in America, I have, as Marina mentioned, a new book which has just come out. In that book, Frederick Gilbert and Susan Dodds have a chapter in which they consider the possibility of brain implants that could detect the neural activity associated with something like an impulsive, angry outburst. This may not be too far-fetched, because quite a lot of work is already being done on technology that could detect the neural activity of epileptic seizures and then issue a warning to somebody who is about to have an epileptic fit, so that they might take some medication, or at least move out of a place where they might harm themselves. Gilbert and Dodds consider the possibility of a similar warning system that might provide some information, perhaps by SMS, to a person who was liable to a violent outburst. They call that an advisory system: the person might then have the option of saying, "OK, I'd better just leave this situation." Or, to take it a step further, they consider what they call an automated system, which detects the onset of an impulsive or aggressive outburst and then engages in some sort of brain stimulation in order to avert it, to calm the person down. The advisory system, they thought, would increase autonomy, so it's morally a good thing: it restores or gives someone a capacity to act more autonomously and not be a kind of slave to their impulses. It gives them an opportunity to consider their actions and make a decision as a moral agent.

Michelle: But what about the automated system that you've foreshadowed?

Allan: The automated system they were more concerned about. One possibility might be to think of somebody like Begbie in the Scottish film Trainspotting, who had a terrible anger-management problem. There was no sign that he wanted to address it, but let's say he did, and he decided to get an automated system. You might think of that as a kind of autonomous decision in which he has bound himself for the future. The more concerning possibility, which they found to be a serious diminution of autonomy, was the idea of such a system being imposed by the state, in such a way that a person would almost cease to be a proper moral agent. They would lose the moral window, in some ways, and Gilbert and Dodds found that a great concern, because it amounts to a diminution of moral agency.
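[Illustration: a minimal sketch of the advisory/automated distinction described above. The risk score, threshold and the notify/stimulate hooks are all hypothetical, not any real device's API.]

    # Python sketch, assuming the implant exposes a decoded "outburst risk" score.
    RISK_THRESHOLD = 0.8  # assumed cut-off

    def advisory_step(risk, notify):
        # Advisory system: warn the person, leave the decision to them.
        if risk > RISK_THRESHOLD:
            notify("High outburst risk: consider leaving the situation.")

    def automated_step(risk, stimulate):
        # Automated system: the device intervenes directly, bypassing the agent.
        if risk > RISK_THRESHOLD:
            stimulate()  # calming stimulation, delivered without asking

    # Same risk score, two very different loci of moral control:
    advisory_step(0.9, notify=print)
    automated_step(0.9, stimulate=lambda: print("stimulation pulse"))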
Michelle: Could it also be a form of preventative punishment? A new kind of crime, a thought crime, for which there is an automatic, embedded kind of response?

Allan: That's right. I think these kinds of technologies, and maybe even more so brain-computer interfaces for control, raise the question of what the criminal action is, and that segues nicely into our own paper in the book. As you know, the law has the concept of mens rea, the guilty mind, and the actus reus, the external component.

Michelle: The external component, yes.

Allan: And that's usually an action, which is usually a bodily action, or an omission or a status in some contexts, but most commonly it's a bodily action. Now let's assume that you can trigger some sort of external device, like a drone. In America they've got brain-drone racing competitions. Let's assume that a person controls a drone by imagining bodily action: they imagine waving their hand, and that signals their drone to turn right, so they've got control by acts of the imagination. Then you might ask, what was the criminal act? If they fly into someone and injure them, what was the conduct constituting the actus reus for some form of assault? Was it a mental act? Could the law say that the act of imagining waving your hand was the conduct constituting the actus reus? That seems a significant step in the history of the law.
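[Illustration: the imagined-movement control loop just described, as a minimal sketch. The imagery classes, command names and drone interface are assumptions for illustration, not Neuralink's or any racing rig's actual API.]

    # Python sketch: decoded motor imagery in, drone command out.
    from enum import Enum

    class Imagery(Enum):
        REST = 0
        IMAGINED_HAND_WAVE = 1

    COMMANDS = {
        Imagery.IMAGINED_HAND_WAVE: "turn_right",  # the mapping from the example
        Imagery.REST: "hold",
    }

    class StubDrone:
        def send(self, command):
            print(f"drone <- {command}")

    def control_tick(decoded, drone):
        # The "flow of signal" the discussion returns to below.
        drone.send(COMMANDS[decoded])

    control_tick(Imagery.IMAGINED_HAND_WAVE, StubDrone())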
Michelle: The law has thus far drawn something of a distinction: the mens rea was supposed to be the mental part, and the actus reus was somehow external. Does the actus reus become something new? Does it become the electrical impulses from the device itself?

Allan: Yes, that's one possibility. If a judge were pressed to consider what the conduct constituting the actus reus was, they might prefer to keep the actus reus in a kind of bodily sense: OK, well, you kind of jiggled your motor cortex.

Michelle: That seems somewhat artificial, doesn't it?

Allan: By imagining a hand wave, you jiggled your motor cortex, or something of that nature, and that neural activity is the conduct constituting the actus reus. Or, if it was an implanted device, so let's say Musk's Neuralink is implanted, you might say the device is part of the agent, the defendant is a kind of cyborg, and the flow of signal through the device was part of their action. But then you might ask: where does the agent end and cyberspace begin? Maybe that is less of an issue for the Neuralink technology, because it does seem to end at the skull. There's no wire; older forms of brain-computer interface had wires leading to the computer, whereas at least this one goes through the air, over Bluetooth or something.
Michelle: In a way, how is it any different? Take the example I think you use in your book, of revenge porn. How is causing the electrical impulses in my brain to move a cursor any different from causing the impulses in my brain to use my hand to move the cursor? What's the difference, really?

Allan: I think there's no moral difference, and I don't think anybody should be acquitted; I can't see any moral reason to acquit anyone of an offence just because they acted by way of a brain-computer interface rather than traditional bodily action. They should be on a par. But there is this theoretical question. If some issue of temporal coincidence arose, a judge might actually have to decide what it was that had to occur at the same time: in order to consider whether the mens rea and the actus reus coincided, a judge might have to decide what the actus reus actually was. I don't for a moment think a person would be found not guilty; the courts will say that something is the conduct constituting the actus reus. It's just an interesting theoretical question as to what it is, because legal doctrine is set up with a kind of presupposition behind it, and the presupposition is that people act through their bodies. So I think the courts will get through it, and the people who ought to be convicted will be convicted.

Michelle: But you think it might cause a rethinking of actus reus? Or will it cause us to abandon it altogether? We do with some offences.

Allan: The idea of abandoning actus reus seems drastic, but it does force this theoretical question. It's hard to say what might be decided to be the conduct constituting the actus reus. I canvass three options: mental act, neural activity, or flow of signal, and the courts could go for some kind of combination. Either way, when it goes up to some court high in the hierarchy and some judges make a decision, I think it will be something to watch. Wrestling with technology is not a new thing for the law; we can remember the internet and some of the difficulties in adapting to internet crimes. But that's more like chapter 11; this is a bit more chapter one, isn't it? It raises questions that might be of more interest to legal philosophers and philosophers of action, that sort of thing.
Michelle: You also write a bit about free will, and in your most recent book, as we've discussed, you talk about the rethinking of actus reus. But I also wondered whether this would prompt us to rethink mens rea. How sensitive are these Neuralinks? Can they examine and act on our thoughts before we're even aware of a thought? I was thinking of the example you have in your book about actus reus and revenge porn: what is the act when somebody is moving a cursor to distribute an intimate image? Combine that with the fact that Elon Musk himself, in his announcement, said that these links will potentially, in the future, allow us to record and download our memories. Sitting in front of a computer that is interfaced with this link, is there scope for inadvertently downloading intimate memories, and distributing them? And what does that mean for things like revenge porn, and the actus reus and the mens rea?
Allan: I think there are a few things there. One is that there's an interesting question about control when it comes to mental acts. These devices have to be set up in such a way that the device doesn't effect an action the person doesn't want to happen, but you also don't want to make it too difficult, so that they have to jump through hoops before anything gets done. There's a fine line between making it too easy and too hard to give effect to actions. And there's an interesting question, which maybe we'll find out about in due course, as to whether people have less control over their mental acts than over their bodily acts. If you say "don't raise your hand", it's easy for me to not raise my hand; if you say "don't imagine waving your hand", that might be a bit harder, and I might have a bit less control. So that's one thing. The memory question is also very interesting. It seems to be a bit further down the track, but there are other companies looking at restoring memory, and you mentioned people with Alzheimer's; it would be a wonderful thing if somebody manages to crack that problem. In the book, Walter Glannon considers what are known as hippocampal prosthetics: devices the idea of which is that they can store memories. Theoretically, if memories can be stored, then maybe they could be inadvertently distributed. The legislation I looked at in the paper was, I think, the intimate-images legislation from New South Wales, and I think the mens rea was intention, and recklessness as well; I don't think there was a negligence version. But maybe legislatures would have to consider that. The whole idea of memory and hippocampal prosthetics is quite interesting, because you can think about something like gross-negligence manslaughter, where somebody forgets to do something, leaving a child in the car or something like that. Walter Glannon explores that possibility in the book and considers this question: let's assume some sort of hippocampal prosthetic device fails, maybe having never failed before, and the person doesn't actually store a memory because the device failed. They don't retrieve the memory, because there was no memory there for them to retrieve. There's this interesting question of who is responsible for that. Did they forget? Are they to be seen as a sort of cyborg, so the system forgot? Or are they a person who was using a tool, this hippocampal device, and they didn't forget; there was just a malfunction in a device they were using? That could be an interesting argument.
Michelle: Yes, who is responsible for that? I suppose that ties back to what we were talking about earlier, in terms of use by the state, or by individuals voluntarily, to control certain antisocial impulses. Can that create in us not just a usurping of our role as moral agents, but also a false sense of moral security, where we stop exercising that muscle?
Allan: That's interesting, isn't it? If that were the case, well, one view is that people's lives are a kind of sum of choices, and if those choices are too heavily controlled by external devices, it raises some philosophical questions about what kind of life is left, if it's not an autonomous life. On the other hand, there might be a sort of community-protection benefit.

Michelle: So on one hand there's this idea of community protection, but on the other hand, what is left to us if we don't have freedom? And how do you evolve morally if you don't have that freedom either?

Allan: Yes. Extrapolating, you might think, well, kids don't need any moral instruction at all. Obviously we're peering a long way into the future and exploring possibilities, but maybe it's worth doing that, in order to consider some of the shorter-term steps.

Michelle: As you say, or I think as pointed out by one of the authors in your book, it's perhaps not that far into the future. If we can already see and predict epileptic seizures, it doesn't seem a giant leap to predict certain antisocial impulses, and to head those off automatically.

Allan: That seems right. The idea of machine-learning approaches detecting the neural patterns associated with impulsive outbursts doesn't seem that far into the future, if they can do it with epileptic fits; maybe it would be much the same.
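[Illustration: the shape of the machine-learning idea just mentioned, by analogy with seizure prediction. The data is synthetic and the features are assumed; this shows the approach, not a working detector.]

    # Python sketch: classify short windows of neural activity as
    # benign vs. pre-outburst, using assumed band-power features.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    benign = rng.normal(0.0, 1.0, size=(200, 8))     # synthetic feature windows
    pre_event = rng.normal(0.7, 1.0, size=(200, 8))  # assumed shifted statistics

    X = np.vstack([benign, pre_event])
    y = np.array([0] * 200 + [1] * 200)
    clf = LogisticRegression().fit(X, y)

    # A new window's predicted probability could drive the advisory warning.
    new_window = rng.normal(0.7, 1.0, size=(1, 8))
    print("outburst risk:", clf.predict_proba(new_window)[0, 1])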
Allan: I've been doing a bit of work recently with a group involved in looking at what's known as neural rights. At Columbia University there's Professor Rafael Yuste, a very eminent neuroscientist, who is very concerned about where neurotechnology is going, and he thinks it's very important to consider all this from a human rights perspective. He's a scientist at the cutting edge of all this, at Columbia in New York, and he thinks it's time to consider these things, not just regard them as speculative science fiction.

Michelle: It seems to me that would enable us not to respond merely reactively to these things: not to wait for problems or disasters to emerge and for the community to be morally outraged in their wake, but to help manage these things before there's some kind of crisis. Because, as you say, the therapeutic benefits are immense.

Allan: One of the things that struck me, watching the Neuralink announcement over the weekend, was that they had a range of people involved in the project, and they did a bit of work to allay concerns about the treatment of the animals. But everybody else seemed to be some form of scientist or engineer, and I wondered if it might be good to have, sitting up there with them, some kind of ethicist, explaining some of the steps they were taking. I think they did mention some cybersecurity and privacy risks, but I think there are others, and that might have been a good thing to do.

Michelle: I agree. What are the ethical and legal implications? Because one of the other things that struck me about the presentation is that we were told it's not going to be a closed system, and so one wonders what might be the scope for brain hacking: hacking into somebody's brain. Already there are concerns about social platforms like Facebook and how they can be manipulated. So how might governments, and businesses, be able to get access to thoughts and desires that we may not even be consciously aware of, and how might that information be manipulated?
Allan: That's right. The historian and futurist Yuval Noah Harari has considered some of these issues. On the more specific brain-jacking question, one of the people I'm working with at the moment, Marcello Ienca from Zurich, has written on that issue: what if somebody hacks into a device and causes it to stimulate at a time when that shouldn't be done, and someone behaves differently as a result? It's quite an amazing thing to think of, but it's something that needs to be considered.

Michelle: At the moment, with mobile phones, you could be walking past a store and it tracks you, and then you get a little advertisement: hey, this store has 30% off this particular product, why don't you go in? Can you imagine something like that, but in your brain, where it might be indistinguishable from your own authentic thoughts?

Allan: Yes: an advertiser's dream, a consumer's nightmare. People are talking about a right to mental privacy and a right to mental integrity, and there's a bit of work being done on that, mainly by philosophers and not so much by lawyers. A lot of the human rights documents, like the Universal Declaration, were created a long time ago, before things like this seemed possible. My understanding is that in Chile, where they're re-examining the constitution, there's a bit of lobbying to get something about mental privacy into it. But we've talked a fair bit about criminal law; I was wondering whether you have any thoughts about the implications of these kinds of technologies for the civil law, or consumer law?
Michelle: Absolutely. I think the implications are immense, and to some extent they mirror some of the concerns you highlight in criminal law: issues around free will and consent, entry into contracts. The Australian Consumer Law exists because it's premised on this idea that a free economy is an efficient economy, an economy that benefits everyone in Australian society. So things like misleading and deceptive conduct, and unconscionable conduct, which essentially disrupt free thought, are prohibited. But imagine an implant which undermines the integrity of free thought and consent. I know there are real concerns about privacy and how data is being used, in terms of the purchases we make and the things we search for through social platforms and online. But imagine something even closer to us: a link that's not a closed system, that interfaces with computers, with others, and potentially with businesses and apps. What kind of body of information might that provide businesses, and how might it be used or misused? And maybe even the implanting of thoughts. I suppose we could all imagine the extreme, where a state or a terrorist group or a ne'er-do-well might hijack somebody's brain, but perhaps we could think of someone closer to home, say a business: what might it be able to do with that kind of information and that kind of power?

Allan: Or even, if implanting a thought was a bit too technically difficult, just stimulating your brain to trigger impulsivity or something like that. That's an easier, shorter-term one.

Michelle: And that wouldn't seem terribly difficult even with current technology. Let's say that, through GPS, you're identified as passing a certain fast-food store, and then you get an impulse: you're hungry. That wouldn't seem too fantastical a thing to imagine, would you agree?
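[Illustration: the geofenced-impulse scenario just sketched, made concrete. Everything here, coordinates, radius and the stimulate hook, is hypothetical; no device is claimed to do this.]

    # Python sketch: a GPS fix near a store triggers a "hunger" impulse.
    from math import hypot

    STORE = (151.2093, -33.8688)  # assumed store coordinates (lon, lat)
    RADIUS_DEG = 0.001            # crude geofence, roughly 100 m

    def on_gps_fix(lon, lat, stimulate):
        if hypot(lon - STORE[0], lat - STORE[1]) < RADIUS_DEG:
            stimulate("hunger")  # the manipulated impulse in the scenario

    on_gps_fix(151.2094, -33.8689, stimulate=lambda s: print(f"impulse: {s}"))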
Allan: Well, possibly, yes. I think another area is employment law: the idea of monitoring, say, the brain of the driver of a heavy goods vehicle to detect signs of drowsiness. There's quite a lot in this for employment lawyers to consider, such as the question of whether an employer might require some sort of brain technology. And it might lead, for some people, to a somewhat dystopic sort of competition, in which employees are competing with each other, using neurotech to stimulate their concentration.
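[Illustration: the drowsiness-monitoring idea in miniature. Rising alpha-band (8-12 Hz) power is a commonly used EEG drowsiness marker; the signal and threshold below are synthetic assumptions.]

    # Python sketch: flag drowsiness from alpha-band power in one EEG window.
    import numpy as np

    FS = 256  # assumed sampling rate, Hz

    def alpha_power(eeg):
        """Mean spectral power in the 8-12 Hz band of a 1-D EEG window."""
        spectrum = np.abs(np.fft.rfft(eeg)) ** 2
        freqs = np.fft.rfftfreq(len(eeg), d=1.0 / FS)
        band = (freqs >= 8) & (freqs <= 12)
        return float(spectrum[band].mean())

    t = np.arange(0, 2, 1.0 / FS)
    eeg = np.sin(2 * np.pi * 10 * t)  # synthetic 10 Hz "drowsy" rhythm
    if alpha_power(eeg) > 50.0:       # assumed alert threshold
        print("drowsiness alert to driver")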
Michelle: That scenario you've just posited seems to have two potential implications. One, I suppose, is a kind of slippery slope: it starts from a good place, where we just want to monitor our employees to make sure they're safe, occupational health and safety, but you can see how that could slide. The other is this scope for us to become part human, part machine, a kind of cyborg, and what that might do to society. If you have the means to increase your ability to retain information, to process information, to stay awake, what does that do to society, for those people who have the resources to access that kind of technology?
Allan: Again, I've been quite interested in the work of Yuval Noah Harari, who's not a legal theorist but has some interesting ideas, and he has envisaged a split between the enhanced and the unenhanced, with the sorts of ethical problems that follow. One issue is the question of how widespread the technology turns out to be. I think Elon Musk was saying he was hoping, at some point, to get it down to a couple of thousand dollars. So maybe it will come down in price. One possibility is that it does turn out to be possible to make quite cheaply, but then maybe people are going to keep getting upgrades, and that would be the wealthy people.

Michelle: A base model and upgrades.

Allan: Yes, that's right.

Michelle: When I saw it, it was like something out of a sci-fi movie or a Black Mirror episode: in order to do it as safely and cheaply as possible, he's developed this machine, so it's not a human being who inserts it, it's a machine. Another film it made me think of was Total Recall, with Arnold Schwarzenegger. So it doesn't seem so crazy that we have this machine that can insert everybody with a link. But in terms of the upgrades and what you can afford, maybe it creates a new class system, perhaps.

Allan: That does seem to be one of the concerns. Elon Musk has said he's worried about some sort of technological singularity, and he thinks an upside is that this might allow humans to keep pace with AI. But it might only be some humans, if he's right; I'm not saying that is a problem, but if it is, it's a concern. One thing that is interesting, and this is my feeling about it, is that there are a lot of ethicists engaged in considering this, and some scientists, like Rafael Yuste from Columbia, but there's not a huge number of lawyers or legal scholars engaged. There's an area of scholarly research called neurolaw, and there's been a lot of discussion about what brain scans might be used for in the criminal justice system, and in civil matters as well, testamentary capacity and that sort of thing. But there's not a great deal about intervening on brains, and about using brains to control devices, and that was really one of the things that led my friend and colleague Dr Nicole Vincent to start the project. Another thing is that a lot of the people who have engaged have focused on the criminal-law side. I think there's not enough legal attention to this, and within the legal attention that exists, there's certainly not enough in areas outside criminal law. So there's a lot of work to be done by lawyers, I think, and by law reform bodies. I feel it hasn't had enough attention from lawyers.
Michelle: At the risk of sounding self-interested, as a lawyer I would be inclined to agree with you, particularly being deeply interested in consumer protection. What I tend to see is that nothing quite motivates like money. While there are certainly some deeply troubling aspects to the criminal law and state control, I wonder whether in fact the greatest threat might be from corporates, and in terms of scammers: not so much to gain power, but to increase their coffers. What can be done with that kind of information? With the technology we currently have, we already see that happening: we see use and abuse of data from mobile phones and from social media, and very subtly. So it does raise a question about how this might be used or abused by businesses, particularly when, right from the outset, it's not intended to be closed. As Elon Musk has declared quite proudly, this isn't going to be a closed system; this is a way for us to interface with the internet, with computers, and with each other. So the scope for abuse seems to me to be vast.

Allan: Yes. I think another thing worth considering is the regulation of all this. There are implanted devices like Neuralink's device, and then there are other forms of non-invasive brain-computer interface that are marketed direct to consumers. Those sort of avoid the medical regulatory scheme, because they're just reading your brain: they're not stimulating, they're not invasive, there's no surgery involved. So there's a question about the regulatory scheme, how these things are managed, and how to prevent harm to the community.
Michelle: There's a lot to it.

Allan: I think that's the upshot of it all: there's a lot to it. We've only just started thinking about it, and there's a lot more to think about, in many areas of law. We're just beginning.

Michelle: And I think the time to think about it is now, because if we don't, we'll just be playing catch-up, won't we? We'll just be reacting to every problem that emerges. And I think this comes back to the point you were making: perhaps the innovators should be working alongside ethicists and lawyers now, so that they can craft these devices in a way that heads off these problems before they emerge. Because the therapeutic benefits are so great, the last thing you want is to stop it, or for a state, out of fear because something unforeseen has happened, to clamp down hard on this. That's the last thing, I think, we would want.

Allan: That's right, I agree. That would be terrible, because there's a wonderful upside to it as well as these pitfalls, and I certainly wouldn't want it to stop. We could keep going on, but anyway, it's been great to discuss this with you. I'd like to chat further, and I hope we do at some point.

Michelle: Thank you very much. It's really been mind-blowing, no pun intended: all the potential applications of this kind of technology, certainly in criminal law, but also in the areas I'm deeply interested in, the formation of contracts and consumer protection. The mind boggles at the potential applications and abuses of it.




