Coming of the Chip
It's been around for a while: a small microchip, about the size of a grain of rice, that can be implanted in the skin between the thumb and hand.
A handful of companies have experimented with these chips for several years.
For some reason, Epicenter, a Swedish company, is getting a lot of media attention this week for offering to implant these chips into employees who volunteer to do so.
There are various articles about this, but one article stated that many of the employees were chipped two years ago.
"Two years ago, Mr Mesterton told news.com.au many of Epicentre’s employees had already been chipped and used the technology in their everyday life."
http://www.news.com.au/technology/science/human-body/swedish-company-epicenter-implants-microchips-into-employees/news-story/5c48700ebb54262ae389db085593ab12
So, the question is: if the company started the chipping program a couple of years ago, why is there a big media rush on it now?
From my limited observation, the younger generation--mostly those in their thirties or younger--is generally more supportive of the idea. The older generation is generally more reluctant, even opposed.
My thought is that these chipping programs are being reported on, and will continue to be, in order to get more people comfortable with, and even supportive of, the idea.
For now, the chips are still fairly basic in what they do. But that will change.
As the chips gain capabilities--such as health monitoring and reporting--and interact more and more with external technology, the convenience-driven generations will begin clamoring for them.
I've previously mentioned, and I still believe it will be the tipping point: the biggest selling point will be perceived gains in personal security. Better identity protection. Keeping track of children. Securing financial and other transactions.
The chips will make it easier and more convenient to interact with technology while maintaining security.
It will be the next big step in security. It will mesh biometric identification with an identity token, providing a more secure method of authentication and authorization for transactions of all kinds: simple logins, entering a room, paying for something online or in the physical world.
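To make that idea concrete, here is a minimal sketch in Python of how such a scheme might work, assuming a toy exact-match biometric check and an HMAC challenge-response for the identity token. Every name, key, and threshold here is hypothetical; no real implant exposes this API.

```python
import hashlib
import hmac
import os

# Hypothetical two-factor flow: "who you are" (biometric) plus
# "what you carry" (the chip's secret key). Purely illustrative.

SHARED_KEY = os.urandom(32)  # secret provisioned into the chip at enrollment


def biometric_match(live_sample: bytes, enrolled_template: bytes) -> bool:
    """Toy biometric comparison. A real matcher would be fuzzy,
    tolerating sensor noise with an error threshold, not exact equality."""
    return live_sample == enrolled_template


def chip_sign_challenge(challenge: bytes, key: bytes) -> bytes:
    """The chip proves possession of its secret by signing a
    verifier-issued challenge (HMAC-SHA256 used here for simplicity)."""
    return hmac.new(key, challenge, hashlib.sha256).digest()


def authenticate(live_sample: bytes, enrolled_template: bytes,
                 challenge: bytes, response: bytes, key: bytes) -> bool:
    # Both factors must pass before the transaction is authorized.
    if not biometric_match(live_sample, enrolled_template):
        return False
    expected = chip_sign_challenge(challenge, key)
    return hmac.compare_digest(expected, response)


# Usage: the verifier issues a fresh random challenge per transaction,
# so a captured response cannot simply be replayed later.
challenge = os.urandom(16)
response = chip_sign_challenge(challenge, SHARED_KEY)
sample = b"fingerprint-template"
print(authenticate(sample, b"fingerprint-template", challenge, response, SHARED_KEY))
```

The point of the challenge-response half is that the chip never transmits its secret; stealing one signed response gains an attacker nothing for the next transaction.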
Many will point fingers at the older generation and the religious "fanatics," claiming they are holding back security by not supporting the implementation of these chips, or personal ID devices. Those who are against it will be painted as not caring about children who could be protected and more easily found in the event of a kidnapping. They will be accused of not supporting easier, better methods for securing our digital lives.
A big problem: for these chips to work in a worldwide, or even national, environment, there needs to be a commonly accepted standard for them. Otherwise, several incompatible types will be developed and deployed, and a common standard will take that much longer to emerge.
We see this today with the various mobile payment platforms: Apple Pay, Android Pay, Softcard, and various mobile wallets. To accept them all, retailers usually need either additional scanners or scanners that support multiple platforms.
A common standard would make adoption easier for retailers.
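As a rough illustration of the fragmentation problem, here is a Python sketch of what a retailer's checkout code ends up looking like without a standard: one adapter class per platform, each hiding its own handshake. The class names and token strings are invented for the example.

```python
from abc import ABC, abstractmethod

# Without a common standard, the point-of-sale system needs a separate
# adapter (and often a separate scanner) for every payment platform.


class PaymentReader(ABC):
    @abstractmethod
    def read_token(self) -> str:
        """Perform the platform-specific handshake and return a payment token."""


class ApplePayReader(PaymentReader):
    def read_token(self) -> str:
        return "token-from-apple-pay"  # stand-in for Apple's NFC handshake


class AndroidPayReader(PaymentReader):
    def read_token(self) -> str:
        return "token-from-android-pay"  # stand-in for Android's handshake


def checkout(reader: PaymentReader, amount_cents: int) -> None:
    # The retailer's code only sees the common interface, but every new
    # platform still means writing and certifying another adapter.
    token = reader.read_token()
    print(f"Charging {amount_cents} cents using token {token!r}")


for reader in (ApplePayReader(), AndroidPayReader()):
    checkout(reader, 499)
```

A single agreed-upon token format would collapse all of those adapters into one reader implementation, which is exactly the pull toward standardization described above.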
But who becomes responsible for the standards? Most likely some government agency, or a government-appointed organization.
And while companies could track transaction information from the devices, it would not take much for all that data to end up with a government agency, especially if everything uses the same standards.
So, while there would be an appearance of increased security, the reality is that privacy would be vastly decreased. And there is a big question: are we more secure if all our data is stored with the government? Or does it just make us more dependent on the government, and actually more vulnerable?
The advent of the chip on a broader scale will most likely also mark the transition to a fully digital currency.