TubeTron 3000: the human implant :reyes:

In the continuous struggle between humans and technology abuse, here’s the next chapter: a chip in a tube implanted in the human body.
If Chuck can’t go into a tube computer, the tube computer goes into Chuck.


http://www.openbaring.org/biochiptecheng.html


And some employees seem to enjoy all this just to avoid using a card to buy snacks from a vending machine…

To me, this is totally foolish and should be prevented by law. Material for the next adventure game, though.

2 Likes

I wonder how the staff at the airport would react when their detectors raised an alarm. It might be difficult to explain believably what it is, because you can’t really show it to them. Either they would believe you, or they would stop you from departing. Also, you never know what the chip reveals about you. There are microphones, radio transmitters and GPS antennas nowadays that are small enough to be built into such an implant.
There have already been low-priced kitchen appliances from China containing bugging devices.

2 Likes

Yes, and this is only one of the many problems created by the misuse of such a device.
The idea of allowing a company to alter its employees’ bodies by implanting a chip under their skin is insane, whether or not it contains a GPS antenna (which would put us on the edge of a dystopian world).

2 Likes

Very interesting…

Chuck has the chip tube implanted, but he is unaware that Sleeper Agent Ray has been uploaded onto the chip. Thus begins an Innerspace-style adventure where Ray has to travel around until she finds his brain and “deactivates” Chuck.

There would be plenty of room for disgusting jokes about bodily fluids with that plot. We all know how much Ron loves those! :stuck_out_tongue_winking_eye:

2 Likes

I don’t want anything implanted into my body! Never.
It’s one step further toward an evil future, like Twin Peaks shows us. Evil is everywhere, even under your skin…

It’s funny: lots of money is being invested to extend human life and cure all kinds of diseases, while the nasty people who govern us stand ready to launch nuclear weapons at each other.

Does that include pacemakers and artificial hips and shoulders?

Got a light?

2 Likes

As long as Google pays taxes in the countries where it makes money, like every worker should (which it doesn’t, and taxes are what make public health care work for everyone), and doesn’t presume to decide which direction medicine should take… they are welcome to invest in finding cures for diseases.

@milanfahrnholz That is the line: if you implant a device to cure a disease (a heart bypass) or to keep a leg or an arm moving, the implant is welcome.
But that’s not the case here. And even when it comes to surgery, we should never forget the first law of medical deontology:
“Primum: non nocere.” “First: do no harm.”
More than once it saved me from useless (and potentially harmful) surgery. Like having my gingiva and mandibular bone opened and cut in two places to remove two molar teeth that were absolutely safe and dormant. I had to go to three different dentists, but in the end it was worth it, because the third one took a conservative approach and simply said to monitor the situation with localized (small and safer) x-rays every 6 months until the age of 26, after which he was pretty sure nothing else would happen. That was knowledge. Nothing happened to me, and I’m still safe and sound.
“Primum: non nocere.”

2 Likes

All I did was oppose that broad generalisation; everything else I of course agree with.

1 Like

Regardless of the topic, I have never agreed with the generic line of thought: “I don’t like X, so X is intrinsically bad, so laws should be made to limit other people’s freedom to do X.”

In my opinion, the decision to “augment” in any way your own body should be personal. People should be helped and educated so that they know both pros and cons of these decisions, but the final decision should be theirs.

But what if the majority of people are using it? Then maybe you have to use it too. For example, I don’t want to use WhatsApp. But I have to, because I’m in a social circle where all the members use WhatsApp. If I didn’t use WhatsApp, it would be impossible (or at least “hard”) for me to remain a member. So “social pressure” can lead to a situation where you have to use a technology that you don’t want to use.

2 Likes

Even in that case, I still think that there shouldn’t be a law that limits people’s freedom to install something on their phone or in their body.

A lot of new social tendencies can create negative effects if taken to the extreme but I think that fear-fed laws should be made only if that fear is supported by evidence that there is a good chance that a hypothetical catastrophic scenario might actually happen.

Also, not making a law today doesn’t prevent a country from making one in the future, in case there is increasing evidence that something bad might actually happen.

“I don’t like the cloning of living humans, so the cloning of living humans is intrinsically bad, so laws should be made to limit other people’s freedom to do cloning of living humans.”

Generalisation works both ways. But I agree that the reasoning is bad in any case (“because I don’t like it”).

I just remembered why I stopped “being a member of society” years ago…

Maybe I failed to make the distinction clear. If I have understood the point of your example correctly, I would say that it’s irrelevant whether X is something that everybody accepts or something that everybody rejects.

Well, you said “regardless of the topic”, and I think the topic is very important. But I agree that the reasoning has to be better than “I don’t like it”.

Same for me, that’s a big problem. You are somehow forced to adopt commercial or useless technology by global tendencies, and technology is far more coercive than other situations. I used an iPhone 3G for a decade, and I would probably still be using it if I hadn’t had to fix the battery a couple of times (breaking the internal speaker in the process). So I was forced to buy a new iPhone, WhatsApp included…

And it’s a point of no return. That’s why I think some tendencies are very dangerous once you accept them. And our privacy has never been so fragile.

Back in school, I learned how precious privacy is and how dangerous totalitarianism would be. Now I notice that this lesson obviously had no effect on us. It’s similar to the pollution of the environment: we do it, even though we know better, just because everyone else does it, too.
Mankind is intelligent enough to think about something clear-sightedly, but still too foolish to act clear-sightedly.

2 Likes