About The Episode
The first episode of 2020, looking at updates in automation technologies over the last month and exploring the concept of the capitalism of things. I'll describe the three layers that make up the trend, leading to an idea where smart objects and systems exchange value and trade services autonomously.
Transcript
Intro
I will try to have a holiday episode come out this week or next. There were quite a few audio quality problems during the recording, so I'm still editing it so the information can be presented without too many issues. I interviewed some family friends, each nearing 100 years in age, and tried to understand what it was like for them to experience the technological changes of the last century. I think that can be quite interesting, since we ourselves are moving through a period that will see even more technological change than the last 100 years, if the exponential nature of technological growth holds true. So I will try to have that interview out soon. Today, though, we will briefly look at the idea of the capitalism of things. During the podcast's break there have, of course, been some relevant developments in the world of tech and automation, so I wanted to spend most of this episode talking about them, as I think they make the ideas presented in the episodes so far a bit more tangible.
Living Robots
So first off, let’s look at something significantly strange to kick off the New Year: the first ever living robot was created. We’ve looked at industrial and service robots before, but what exactly is a living robot, or Xenobot as it’s been called? Xenobots are tiny, submillimetre-sized blobs, for lack of a better word, containing between 500 and 1,000 cells, that can so far move across a petri dish and even carry very small payloads. They were designed by algorithms and assembled by humans using stem cells from an African frog. The stem cells are manipulated into skin and heart muscle cells; for a more detailed explanation of how this was done, I’ve linked an article in the shownotes. The relevant part to know is that the skin cells act as a sort of mesh that holds everything together, while the contractions of the heart muscle cells propel the Xenobots. The machines moved around independently for up to a week, powered by their own energy stores in the form of lipids and proteins. The researchers at the University of Vermont claim these are novel living machines, “neither a traditional robot nor a known species of animal. It’s a new class of artifact: a living, programmable organism.” Apart from the pretty wild reality of this new technology, not to mention the ethical issues, I think the possibilities for using these ‘custom living machines’ will be what interests most of us. Some proposed examples are delivering medicine to specific areas of the body or cleaning up microplastics from our oceans. But as it is still early, we will have to wait and see.
Ban on Facial Recognition
Secondly, for those of us living in the EU: it was recently leaked that the European Commission is considering measures to impose a temporary ban on facial recognition technologies used by both public and private actors, according to a draft white paper on artificial intelligence. This is the technology becoming widespread in China, enabling both convenient cashierless stores and what some argue is a problematically policed, or even controlled, population. The completed white paper will come out in late February, and will also deal with AI within Europe on a more general level.
Massive 3D printed building
Then, near the end of 2019, a massive 640-square-meter building was unveiled in Dubai, its concrete walls built using a giant 3D printer. It is actually the largest 3D printed building ever, and it is making its way into the Guinness World Records. More relevantly, the building required only half the construction workers to complete and generated 60% less waste compared to traditional construction techniques; it also cut the cost of construction by more than 50%. The shape of the building is of course unique and offers an opportunity to imagine what different kinds of structures could be built using the technology. I’ll have a link to it in the shownotes for those interested in seeing it, as well as the 3D printer used to build it. Also, for those listening on Spotify, iTunes or elsewhere: the shownotes can be found on the website. Unfortunately only a short blurb can be added to each episode on these platforms, but the website has the links and relevant sources for each episode.
CES
Samsung’s AR glasses
Most notable of course was CES 2020, the Consumer Electronics Show held yearly in Las Vegas, where a number of new technologies are typically showcased for the first time. I wanted to talk about a few that are relevant to previous episodes of this podcast. First are Samsung’s prototype AR glasses and workout exoskeleton. As I personally purchased a VR headset and amazed my family and a few friends with it during the holidays, I was surprised to see that these new glasses are already significantly smaller and lighter than the bulky VR headsets out today. If you haven’t used a pair yet, I highly recommend it, and you can listen to my episode on VR and AR if you’re curious about how they will impact jobs. The interesting part is that the technology is definitely shifting towards smaller and more comfortable products. It’s very probable that today’s headsets will one day be seen like the large, block-like cell phones of the 1970s. The same is happening with exoskeletons, which Samsung also unveiled, proposing that they be used with workout games to correct a user’s posture and track certain body metrics, in combination with the AR glasses. Compare this to the large, bulky powered versions I talked about supporting employees in industry as well as in healthcare.
Bot Chef
Another new technology presented at CES was Bot Chef. Though I’ve already mentioned robotic arms that can undertake cooking tasks, this pair of robotic arms has a few new tricks. Firstly, it is voice activated and starts work from a single command, like ‘Bot Chef, make a salad’. Secondly, the arms can function independently, with one dicing ingredients for a salad while the other prepares a pot of coffee. Thirdly, it has collaborative aspects built in: while it was holding a knife, a human assistant walked towards it and its sensors alerted the arm to slow down as the assistant approached, engaging its safety measures and even emitting a buzzing sound to warn the assistant of the knife and the safety risk. Though Bot Chef is currently dependent on a human co-chef, one can imagine that professional chefs, as well as anyone cooking at home, could be impacted should this technology become affordable for the average consumer.
Neon’s Artificial humans
One of, if not the, biggest hits of CES came from a startup called NEON. The company showcased ‘artificial humans’, similar to the ideas presented in the last episode before the holiday break about digital humans and holograms replacing people in entertainment. Neon’s ‘artificial humans’ are digital avatars that look and act like real humans, but displayed on a screen. The excitement was generated by the fact that they appeared on the wall as yoga instructors, bankers, pop stars, news anchors and fashion models. The idea is to add a more human look to the growing use of digital AI assistants: someday a NEON artificial human might check you into a hotel, or become the face of a virtual chauffeur in your future self-driving car. The hype surrounding this tech was dampened, though, by the fact that the lifelike models were simply human actors, a portrayal of what the company hopes to accomplish. But the future digital humans will be able to generate countless facial animations as well as speak multiple languages. This looks like a more tangible human layer that can be added to growing AI systems like digital assistants, enabling an easier uptake of AI technologies in the future.
Toyota’s city of the future
But most interesting, I think, was when Toyota revealed plans to build a prototype “city” of the future on a 175-acre site at the base of Mt. Fuji in Japan. Called the Woven City, it will be a fully connected ecosystem powered by hydrogen fuel cells and is planned to be fully sustainable, with buildings made mostly of wood to minimize the carbon footprint. Residences will be equipped with the latest in human support technologies, such as in-home robotics to assist with daily living. The homes will use sensor-based AI to check occupants’ health, take care of basic needs and enhance daily life. This is coupled with a fully autonomous transportation system embedded in the Woven City. It won’t be ready for some time, however; groundbreaking only starts in 2021. But it will be an interesting project to follow, as it links to many of the things I’ve talked about on the podcast, and I’m quite curious how the day-to-day life of its residents will be different, for better or for worse. I’m also curious to hear what you think about this, so let me know on Twitter if you want.
The Capitalism of Things
Linked to this automated and connected future city, we can look at the topic of this week’s episode: the capitalism of things (‘Back to the Future? From time-based to task-based work’ by Marina Gorbis, an article from Future Now magazine, 2016). Like many of the technologies showcased at CES, the implementation is only in its initial phase and may never be fully realised, but it is very much connected to this shift towards the automation of human tasks. The capitalism of things was hinted at in a few episodes, especially the episode on the internet of things, which is where the term is derived from. The internet of things, if you recall, is where sensors enable various objects to connect to the internet, share and receive data, and give us a more precise understanding of physical characteristics. I used the example of agriculture, where sensors in the ground can tell a modern farmer the moisture level of the soil, indicating whether a crop needs to be watered or not. To understand where the capitalism of things may come from, let’s look at three successive layers. The first can be seen in the emergence of new networking technologies, which have created task-based work. Think of platforms like Fiverr or Upwork, where you can hire freelancers to take on specific tasks: develop a logo, translate an article, edit a video, and so on. This has enabled large projects to be broken down and distributed across multiple individuals. Artificial intelligence has pushed this further, pairing those best suited to complete a task with those needing the service, without an intermediary making the connection. Someone needing a logo designed can be guided to the designer that suits their price, time, and quality needs through these online platforms; also think of Uber connecting a driver with a passenger without the need to call a central agency to arrange a pickup.
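As a minimal sketch of this first layer, here is what algorithmic task-matching might look like in Python. All names, rates, and constraints are hypothetical and purely illustrative; real platforms use far richer signals, but the core idea of filtering by a client’s constraints and ranking by fit is the same.

```python
# Hypothetical sketch of layer one: a platform pairing a client's task
# with the freelancer who best fits its price, turnaround, and quality
# requirements -- no human intermediary involved. All names and numbers
# are illustrative assumptions, not a real platform's API.

from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Freelancer:
    name: str
    rate: float      # price for the task
    days: int        # turnaround time in days
    rating: float    # quality score, 0-5


def best_match(freelancers: List[Freelancer],
               max_rate: float, max_days: int) -> Optional[Freelancer]:
    # Keep only freelancers within the client's budget and deadline,
    # then pick the highest-rated one (or None if nobody qualifies).
    eligible = [f for f in freelancers
                if f.rate <= max_rate and f.days <= max_days]
    return max(eligible, key=lambda f: f.rating, default=None)


pool = [
    Freelancer("ana", rate=80, days=3, rating=4.9),
    Freelancer("ben", rate=40, days=5, rating=4.2),
    Freelancer("cho", rate=60, days=2, rating=4.7),
]

match = best_match(pool, max_rate=70, max_days=4)
print(match.name if match else "no match")  # prints: cho
```

Here "ana" is excluded on price and "ben" on turnaround, so the system routes the task to "cho" without anyone manually browsing profiles.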
Second, when more autonomous systems are in place, humans are removed from the production side of the equation. For example, looking at Uber again: the passenger will be the only human in the interaction once Uber adopts autonomous vehicles, something it has been investigating heavily. When this happens, both the driver and the matchmaker will be autonomous systems, and only the consumer, the passenger in this case, will be human.
The third layer is where the capitalism of things is introduced. Here, smart objects and systems exchange value and trade services autonomously. Going back to the agricultural example: think of sensors around the crops detecting ripeness, relaying that data to autonomous harvesters (like the ones being tested in California right now), and the packaged food being shipped by autonomous vehicle to stores. The interesting part is that many stores now use sensors to determine when product levels are low or empty, and that data is transmitted to have the stocks replenished, without employees scanning barcodes in the aisles. So in this example, the store, the transportation system, and the farm are communicating and exchanging value and goods autonomously. The same principle is being tested in smart homes, and will certainly be a key aspect of Toyota’s ‘future city.’ Smart kitchens and cabinets could interact directly with Amazon or other goods providers and request a restock of household products, from toilet paper to lettuce to shampoo, without any human request. Agriculture and homes are but two examples; the idea of the capitalism of things is that these smart, self-communicating systems could interact in many industries and perhaps account for a greater share of economic transactions than those made by individuals… if the systems in place work as designed, of course. This is rather speculative and not going to be implemented tomorrow, but the technologies required to make it happen are in their early stages. The privacy, security, and regulatory concerns are far from dealt with, but it is nevertheless an interesting idea that has hints of taking shape today.
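To make the third layer concrete, here is a minimal sketch of the restocking loop described above: a smart shelf senses low stock and autonomously places an order with a supplier system, machine to machine. Everything here is an illustrative assumption (the thresholds, the "store-7" and "supplier-agent" names), not a real retail API.

```python
# Hypothetical sketch of layer three, the capitalism of things:
# a store shelf's sensor data triggers an autonomous purchase order,
# with no human on either side of the transaction.

from dataclasses import dataclass
from typing import Optional


@dataclass
class SmartShelf:
    product: str
    units: int               # current stock on the shelf
    reorder_threshold: int   # reorder when stock drops below this
    reorder_quantity: int    # how much to order each time

    def reading(self) -> int:
        # In a real deployment this would poll a weight or camera sensor;
        # here we just return the stored stock level.
        return self.units


def autonomous_restock(shelf: SmartShelf) -> Optional[dict]:
    """If stock is below the threshold, emit a purchase order addressed
    to a supplier's agent -- the store's system is the buyer."""
    if shelf.reading() < shelf.reorder_threshold:
        return {
            "product": shelf.product,
            "quantity": shelf.reorder_quantity,
            "buyer": "store-7",          # the store's agent, not a person
            "seller": "supplier-agent",  # the supplier's agent
        }
    return None  # stock is fine; no transaction happens


shelf = SmartShelf(product="lettuce", units=3,
                   reorder_threshold=5, reorder_quantity=20)
order = autonomous_restock(shelf)
print(order)
```

With 3 units left and a threshold of 5, the shelf emits an order for 20 more; a fully stocked shelf emits nothing. The same pattern, sensor reading in, value-exchanging message out, is what the farm-to-store example above chains together across several autonomous systems.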
That’s it for the first episode of 2020. As I mentioned in the last episode of 2019, the topics will now shift away from specific technologies and towards the overall impact of automation. Next week we’ll look at the jobs that have disappeared and the jobs that are emerging.