Internet of Robotic Things
If you watch too many films like I, Robot, Blade Runner or the Terminator series, you might imagine an apocalyptic future in which humanity is pitted against robots in a struggle for supremacy or even existence. Alternatively, you may decide that, in reality, AI is still a very long way from being able to fulfil the SF writers’ dreams or nightmares.

However human-robot co-existence pans out, one thing’s for sure – the ‘future’ that’s already with us is more startling and exotic than many people realise. This is true both of robotic development and of the relationships between people and robots.
For example, oil rigs and pipelines are benefitting from robotic snakes capable of wriggling through the depths of the sea to perform underwater inspection, maintenance and repairs.
Back on land, Boston Dynamics has demonstrated a humanoid robot that can perform incredible feats of strength and agility, including jumps and backflips.
The company has produced many robots that act eerily, and sometimes frighteningly, like humans or animals. In Dubai, there are major ambitions to become a smart city, with drones and robots central to local development plans. The city has already trialled a drone that it hopes will facilitate a viable airborne transportation system, possibly within five years.
Robotic-human relationships are developing as well. After becoming tired of the pressure to marry, Zheng Jiajia’s solution was to wed Yingying, a robot he built himself – but only after two months of ‘dating’. And at a tech summit, Saudi Arabia decided to award citizenship to Sophia, a robot built by Hanson Robotics. While this was a publicity stunt, the question of whether we should be giving robots rights is a big one.
This article looks at where we are with robots today, and where we may be headed. It starts by reviewing robot types of widely varying sizes – from those as tall as a house to devices that can manipulate molecular cargoes – to show how far today’s robotic landscape already extends and the themes driving its ongoing development. This review includes the special case of humanoid robots and androids; it also considers drones as a type of robot.
Next, we extrapolate this robotics landscape by considering the new opportunities offered by integrating robots with the IoT to create an Internet of Robotic Things (IoRT), the realities of artificial intelligence (AI), and the possible impacts – adverse or beneficial – on the future of jobs.
Finally, we provide a bridge between these ‘big picture’ considerations and today’s immediate environment by offering some examples of off-the-shelf kits available today; these allow engineers to explore robotic possibilities without requiring a corporate-sized budget.
Robot types
Robotic wars and competitions: Some of the largest robots today are the giant piloted fighting robots produced by MegaBots, Inc. to fight in stadium-sized arenas. These 15-foot-tall humanoid robots fire cannonball-sized paintballs at one another at speeds of over 120 mph. The last robot left standing is the winner.

An innovative approach to building – the Digital Construction Platform: In just half a day, a new type of robot built an igloo-shaped building half the diameter of the U.S. Capitol dome—all by itself. In the future, such autonomous machines could assemble entire towns, create wacky Dr. Seuss–like structures, and even prepare the moon for its first human colony.
Developed by a team from the Massachusetts Institute of Technology’s (MIT’s) Mediated Matter lab in Cambridge, which focuses on materials science and design, the Digital Construction Platform consists of a large hydraulic arm on motorized tank-like treads.
At the end of this robotic arm is a smaller electric arm for finer movements, complete with a suite of sensors for positioning and stability control, along with swappable tools for welding, digging, and printing. The combined reach of the arms is more than 10 meters. The robot also carries solar panels and batteries, and an electronic tip which sprays a line of expanding foam to print the structure.
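To give a feel for the control problem, the path the print tip follows can be approximated as a stack of ever-smaller circles. The sketch below generates waypoints for an idealised hemispherical dome; the radius, layer height and point count are purely illustrative assumptions, not MIT’s actual parameters, and the real structure was not a true hemisphere.

```python
import math

def dome_print_path(base_radius_m=7.0, layer_height_m=0.025, points_per_layer=180):
    """Generate (x, y, z) waypoints for an idealised hemispherical dome,
    printed as a stack of circular foam beads, one layer at a time."""
    waypoints = []
    z = 0.0
    while z < base_radius_m:
        # Radius of the circle where this layer intersects the hemisphere
        r = math.sqrt(base_radius_m ** 2 - z ** 2)
        for i in range(points_per_layer):
            theta = 2 * math.pi * i / points_per_layer
            waypoints.append((r * math.cos(theta), r * math.sin(theta), z))
        z += layer_height_m
    return waypoints

path = dome_print_path()
print(f"{len(path)} waypoints, final layer at z = {path[-1][2]:.2f} m")
```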
Factory robots: Statistics from the International Federation of Robotics show that 253,748 robots were sold in 2015. One-third of these went into the automotive sector, 25 percent to electrical and electronics companies, and 12 percent to the metal and machinery industry. The remainder ended up in industries as diverse as aerospace, food packaging and pharmaceuticals.
Large robots are finding their way into smaller enterprises as advances in robotic technology and lower costs remove the barriers to implementing control and automation applications.
In assembly, components are increasingly presented to the robot through vision systems, while force sensing lets it adjust and adapt to tight fits just as a human worker would – in essence, robots are becoming more dexterous.
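In control terms, much of that dexterity comes down to a simple feedback loop: press along the insertion axis, read the force sensor, and nudge sideways whenever resistance builds rather than pushing harder. The sketch below illustrates the idea; the `robot` interface and all the numbers are hypothetical placeholders, not any particular manufacturer’s API.

```python
def insert_with_force_feedback(robot, step_mm=0.5, lateral_mm=0.1,
                               force_limit_n=5.0, depth_mm=20.0):
    """Toy compliant-insertion loop: press downward in small steps and use
    the wrist force sensor to steer sideways whenever resistance builds.

    `robot` is a hypothetical interface assumed to expose:
        read_force()  -> (fx, fy, fz) in newtons
        move_relative(dx, dy, dz) in millimetres (negative z = downward)
        depth()       -> current insertion depth in millimetres
    """
    while robot.depth() < depth_mm:
        fx, fy, fz = robot.read_force()
        if abs(fz) > force_limit_n:
            # Jammed: nudge sideways against the measured side load
            # instead of pushing harder, much as a human assembler would.
            robot.move_relative(-lateral_mm if fx > 0 else lateral_mm,
                                -lateral_mm if fy > 0 else lateral_mm,
                                0.0)
        else:
            robot.move_relative(0.0, 0.0, -step_mm)
```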
Increasingly, robots’ flexibility and ease of reprogramming for new designs or production lines makes them a low-risk investment for SMEs.
A new wave of robots, far more adept than those now commonly used by automakers and other heavy manufacturers, is replacing workers around the world in both manufacturing automation and distribution.
For example, in Philips Electronics’ factory in Drachten, the Netherlands, 128 robots, guided by video cameras, perform feats well beyond the capability of the most dexterous humans. One robot arm endlessly forms three perfect bends in two connector wires and slips them into holes almost too small for the eye to see. The arms work so fast that they must be enclosed in glass cages to prevent the people supervising them from being injured. And they do it all without a coffee break — three shifts a day, 365 days a year.
Drones: Most people imagine a drone as a solitary, remote-controlled toy with propellers, or perhaps a large, unmanned military aircraft. The future’s reality, though, could be strikingly different. According to a BBC ‘Futurenow’ report, drones are becoming smaller, cheaper to make, and will start swarming in groups of hundreds or even thousands, to fly like a flock of birds.
On the battlefield, such swarms could outperform weapons and technology that militaries have used for decades. In a congested city, teams of tiny quadrotors could buzz around to gather intelligence. Tank battalions could be overrun by miniature attack drones diving in from all directions at once. Many might be shot down, but others might make it through to destroy the tanks.
Swarms have already been deployed: 300 drones assembled into an American flag in Lady Gaga’s Super Bowl halftime show, illuminating the night sky.
In the future, swarms could also check pipelines, chimneys, power lines and industrial plants cheaply and easily.
On the farm, they can spot plant disease and help manage water use, or spray pesticides and herbicides only in the exact spot needed, all working cooperatively to cover the area and fill in gaps.
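The cooperative part can be pictured as a shared task list: the field is split into cells, each drone repeatedly claims the nearest untreated cell, and any missed cell simply stays in the pool until another drone takes it. A deliberately simplified sketch of that idea, with invented coordinates:

```python
import math

def assign_cells(drones, cells):
    """Greedy cooperative coverage: each drone repeatedly claims the nearest
    unassigned cell, so the swarm spreads out and fills gaps.
    `drones` and `cells` are (x, y) positions; returns {drone_index: [cells]}."""
    remaining = list(cells)
    plan = {i: [] for i in range(len(drones))}
    positions = list(drones)
    while remaining:
        for i, pos in enumerate(positions):
            if not remaining:
                break
            # Pick the closest still-untreated cell to this drone
            target = min(remaining, key=lambda c: math.dist(pos, c))
            remaining.remove(target)
            plan[i].append(target)
            positions[i] = target  # the drone will be there for its next pick
    return plan

# Example: four drones covering a 6 x 6 grid of field cells
field = [(x, y) for x in range(6) for y in range(6)]
print(assign_cells([(0, 0), (5, 0), (0, 5), (5, 5)], field)[0])
```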
At an even smaller scale, Harvard’s Wyss Institute’s RoboBee project is developing tiny drones smaller than a paper clip and weighing a tenth of a gram. Thousands of RoboBees could be used for weather monitoring, surveillance, or even crop pollination as honey bee numbers decline.
DNA robots: While RoboBee may sound like a tiny implementation of a robot, it’s by no means the smallest. Researchers from the California Institute of Technology in Pasadena have found that miniature robots with arms and legs made of DNA can sort and deliver molecular cargo. These DNA robots could shuffle nanoparticles around on circuits, assemble therapeutic compounds, separate molecular components into trash for recycling, or deliver medicines where needed in the body.
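The Caltech robots work by taking random steps across a DNA origami surface, picking up a cargo molecule when they encounter one and releasing it only at that cargo’s designated drop-off site. The toy simulation below captures that sort-by-random-walk behaviour; the track length, cargo layout and step budget are invented for illustration.

```python
import random

def random_walk_sort(track_len=20, cargoes=None, goals=None, max_steps=100_000):
    """Toy model of a DNA walker sorting two cargo types: the walker drifts
    randomly along a track, grabs any cargo it steps on, and drops it only
    when it reaches that cargo type's goal site."""
    cargoes = dict(cargoes or {3: "A", 7: "B", 12: "A", 16: "B"})  # position -> type
    goals = dict(goals or {"A": 0, "B": track_len - 1})            # type -> drop-off
    pos, carrying, delivered = track_len // 2, None, []
    for step in range(max_steps):
        pos = max(0, min(track_len - 1, pos + random.choice((-1, 1))))
        if carrying is None and pos in cargoes:
            carrying = cargoes.pop(pos)           # pick up whatever is here
        elif carrying is not None and goals[carrying] == pos:
            delivered.append(carrying)            # drop at the matching site
            carrying = None
        if not cargoes and carrying is None:
            return step + 1, delivered
    return max_steps, delivered

steps, order = random_walk_sort()
print(f"all cargo delivered after {steps} random steps: {order}")
```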
Humanoid robots and androids: Some definitions differentiate these two types by saying that a humanoid robot merely approximates to human form, while an android is designed to mimic a human as closely as possible.
According to this view, humanoids are built with the same basic physical structure and kinetic capabilities as humans but are not intended to really resemble people. They may have jointed arms and legs, for example, which can move in the same ways that human limbs do, but have a plastic or metal exterior that in no way mimics human appearance. Motors and hydraulic lines may be visible. Examples of this type of humanoid include Aldebaran Robotics’ Nao and Google-owned Boston Dynamics’ Atlas robot.
Atlas is the latest in a line of advanced humanoid robots that Boston Dynamics is developing. Atlas’ control system coordinates motions of the arms, torso and legs to achieve whole-body mobile manipulation, greatly expanding its reach and workspace. Atlas’ ability to balance while performing tasks allows it to work in a large volume while occupying only a small footprint.
By contrast, androids resemble humans so closely that they could be mistaken for living people; this type of android is often modeled on live humans. Eve-R, from the Korea Institute of Industrial Technology (KITECH), and Geminoid DK are two examples of this.
The University of Pisa’s International Research Center ‘E.Piaggio’ is researching ‘Emotional Human Robot Interaction’ using human-like robots which embody emotional states, empathy and non-verbal communication. The research group is using a life-like android called FACE (Facial Automation for Conveying Emotions), developed in collaboration with Hanson Robotics, which presents emotional information through facial expressions to study the human-robot empathic link. FACE is part of a complex Human Interaction Persuasive Observation Platform (HIPOP) able to collect synchronized information acquired from physiological, psychological and behavioral data sensors. Thanks to its modularity, HIPOP allows scientists to configure different experiments by selecting the number and type of available modules to match protocol requirements.
The Internet of Robotic Things
The Internet of Things (IoT) is bringing us unprecedented insight into and control over the world about us; in our homes, factories, offices, city infrastructures, farms and more. It does so by connecting large numbers of smart edge devices to powerful, cloud-based computing and analytics resources.
Meanwhile, Telefonica has described robots as machines that exhibit intelligent behaviour as they sense and interact with their environment. What if we combined these entities – the IoT and robots – into a new ecosphere? Giving robots an internet connection adds an enormous source of information to support robot decision-making and interaction.
The next logical step is for this ubiquitous connectivity to produce smart devices that not only get their individual jobs done, but also mesh together to create a combined intelligence and determine the best course of action for all the devices involved.
The concept of integrating teams of robots and the IoT has been named ‘the Internet of Robotic Things’, or IoRT. ABI Research defines the IoRT as “intelligent devices that can monitor events, fuse sensor data from a variety of sources and use local and distributed ‘intelligence’ to determine a best course of action”. Robotic principles of sensing, movement, mobility, manipulation, autonomy and intelligence are all enhanced by the IoT.
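Stripped to its essentials, that definition describes a loop: gather readings from several sources (local and shared), fuse them into a single picture of the situation, and choose an action. A minimal sketch of such a monitor-fuse-decide step, with invented sensor values, thresholds and action names:

```python
import statistics

def fuse_and_decide(local_readings, cloud_readings, alarm_threshold=70.0):
    """Toy IoRT decision step: fuse temperature readings from a robot's own
    sensors with readings shared by nearby devices, then choose an action."""
    all_readings = list(local_readings) + list(cloud_readings)
    fused = statistics.median(all_readings)   # median is robust to one bad sensor
    if fused > alarm_threshold:
        return "divert_robot_to_inspect"
    if max(all_readings) - min(all_readings) > 15.0:
        return "flag_sensor_disagreement"      # distributed data disagrees: ask for help
    return "continue_normal_patrol"

# Example: the robot's own sensors and neighbouring devices agree a hot spot is developing
print(fuse_and_decide([66.0, 71.0], [72.0, 74.0, 79.5]))
```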
Robotics scientists no longer have to invest huge amounts of time, energy and money in recognition capabilities for robots, as the IoT provides reusable and open information that robots can access to carry out their tasks. These connected IoRT robots are just the logical evolution of robotics.
Transforming the machine-to-machine (M2M) concept into robot-to-robot seems a natural evolution: we expect robots to perform jobs more effectively, accurately and reliably, just as we expect M2M technologies to deliver better results than traditional industrial control and automation processes.
Amazon fulfilment warehouses – a practical application of the IoRT: As an article in Information Week points out – “While Amazon’s drone delivery program and its future potential receives plenty of coverage, the real magic of robots and the IoT is happening in their vast fulfilment warehouses”.
Instead of running an endlessly repetitive production line, Amazon, like other retail fulfilment operations, has a business model where every order is unique. They are handling thousands if not millions of products, all with varying sizes, weights and shapes. Previously, to fulfil an order, warehouse workers had to roam the floor, scanning racks of merchandise to locate each specific product. This activity has been replaced by robots that move the racks, or ‘pods’ that store the products, to where the workers need them.
The robots are controlled by a central processor over a secure WiFi network. They have two powered wheels that allow them to rotate in place, infrared sensors for obstacle detection, and downward-facing cameras that read QR codes on the floor. These QR codes tell each robot its location and heading.
The robotic warehouse owes its success not to the robots themselves, but to the intelligence behind the system. Amazon processes hundreds of orders per second; when the customer clicks the ‘buy’ button, the order enters a sophisticated fulfilment system, which locates the products across Amazon’s delivery centres. Once the order is organised, the robots locate the relevant pods and move them to assigned packing stations for preparation and shipment.
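In software terms, each drive unit’s job at any moment is simple: read the QR code beneath it to fix its position and heading, compare that with the path the central controller assigned, and either rotate in place, roll forward one square, or stop for an obstacle. The sketch below shows that decision step; the grid representation, command names and field layout are assumptions for illustration, not Amazon’s actual protocol.

```python
HEADINGS = ["N", "E", "S", "W"]  # clockwise order

def next_command(qr_cell, qr_heading, path, obstacle_ahead):
    """Decide one drive-unit move from the floor QR code and the planned path.
    `path` is the remaining list of grid cells assigned by the central controller."""
    if obstacle_ahead:
        return "wait"                          # IR sensor sees something: stop and report
    while path and path[0] == qr_cell:
        path.pop(0)                            # drop waypoints already reached
    if not path:
        return "lift_pod"                      # arrived under the target pod / station
    dx, dy = path[0][0] - qr_cell[0], path[0][1] - qr_cell[1]
    wanted = {(0, 1): "N", (1, 0): "E", (0, -1): "S", (-1, 0): "W"}[(dx, dy)]
    if wanted != qr_heading:
        # Two powered wheels let the unit rotate on the spot to face the next cell
        diff = (HEADINGS.index(wanted) - HEADINGS.index(qr_heading)) % 4
        return "rotate_right" if diff in (1, 2) else "rotate_left"
    return "drive_forward"                     # one grid square towards the next QR code

# Example: robot at (2, 3) facing East, needs to reach (2, 5) via (2, 4)
print(next_command((2, 3), "E", [(2, 4), (2, 5)], obstacle_ahead=False))
```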
Towards truly intelligent robots: the progress of artificial intelligence

Everyone knows that robots are providing powerful and flexible solutions to an ever-increasing range of applications – but how truly intelligent, in a human sense, are they, or could they become?
This depends on the artificial intelligence, or AI, that drives them. As an article by ‘Howstuffworks’ points out, the ultimate aim of AI is to recreate the human thought process: the ability to learn just about anything, to reason, to use language and to formulate original ideas. Roboticists are nowhere near achieving this level of artificial intelligence, but they have made a lot of progress with more limited AI. Today’s AI machines can replicate some specific elements of intellectual ability.
For example, a computer can solve problems by gathering facts through sensors or human input. It then compares this information to stored data and evaluates its meaning. Next, it runs through various possible scenarios and predicts which action will be most successful. It can only apply this to problems it’s programmed to solve – playing chess, for example.
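That ‘run through scenarios and predict which action will be most successful’ step is, at heart, a search. The sketch below shows a bare-bones minimax search for a generic two-player game; the `state` interface (legal_moves, apply, evaluate, is_over) is a hypothetical stand-in, and real chess engines search far deeper and prune aggressively.

```python
def best_move(state, depth, maximising=True):
    """Minimal minimax search: simulate each legal move, score the resulting
    positions, and back up the score the opponent would allow us."""
    if depth == 0 or state.is_over():
        return state.evaluate(), None          # stored knowledge: a position-scoring rule
    best_score, best = (float("-inf"), None) if maximising else (float("inf"), None)
    for move in state.legal_moves():           # the scenarios the machine runs through
        score, _ = best_move(state.apply(move), depth - 1, not maximising)
        if (maximising and score > best_score) or (not maximising and score < best_score):
            best_score, best = score, move
    return best_score, best
```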
A robot can learn, for instance, by recognising whether a certain action – moving its legs in a particular way, say – achieves the desired result in navigating obstacles. The robot stores this information and attempts the successful action the next time it encounters the same situation. However, this ability is limited; a robot can’t absorb any type of information the way a human can.
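That ‘remember what worked last time’ behaviour can be captured in a few lines: keep a score for each situation-action pair, nudge it up or down after every attempt, and prefer the highest-scoring action when the same situation recurs. A minimal sketch, with invented situation and action labels:

```python
import random
from collections import defaultdict

class TrialAndErrorLearner:
    """Keeps a score for each (situation, action) pair and reuses whatever
    worked best the last time the same situation came up."""

    def __init__(self, actions, explore_rate=0.1):
        self.actions = actions
        self.explore_rate = explore_rate
        self.scores = defaultdict(float)          # (situation, action) -> running score

    def choose(self, situation):
        if random.random() < self.explore_rate:   # occasionally try something new
            return random.choice(self.actions)
        return max(self.actions, key=lambda a: self.scores[(situation, a)])

    def record(self, situation, action, succeeded):
        # Reinforce actions that achieved the desired result, penalise the rest
        self.scores[(situation, action)] += 1.0 if succeeded else -1.0

# Example: learning which gait clears a low obstacle
learner = TrialAndErrorLearner(["step_over", "walk_around", "back_up"])
learner.record("low_obstacle", "step_over", succeeded=True)
print(learner.choose("low_obstacle"))   # usually "step_over" from now on
```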
Some robots can interact socially. Kismet, a robot at M.I.T.’s AI Lab, recognises human body language and voice inflection and responds appropriately. Kismet’s creators are interested in how humans and babies interact based only on tone of voice and visual cues. This low-level interaction could be the foundation of a human-like learning system.
Because natural intelligence is still so little understood, AI research is largely theoretical. Scientists hypothesise about how and why we learn and think, and they test their ideas using robots. The M.I.T. team focuses on humanoid robots because they feel that being able to experience the world as a human does is essential to developing human-like intelligence. It also makes it easier for people to interact with the robots, which in turn may make it easier for the robots to learn.
See our article “AI’s place in the IoT infrastructure” for more discussion on AI developments.
The rise of the robots – a benefit or blight?
Will the rise of robots and AI ultimately bring benefits to society, improving quality of life – or will the result be misery as vast numbers of jobs disappear?
According to the IET’s Engineering & Technology magazine, the issue is deeply concerning to governments, workers and even industry leaders. There have been some truly dire warnings: analyst group PwC estimates that automation will take 40 per cent of US and 30 per cent of UK jobs by 2030, for example, and Bank of England chief economist Andy Haldane has said that up to 15 million jobs in Britain could go to robots.
It’s being taken seriously, too, by industry leaders from Bill Gates (who suggests robots should pay taxes) to Elon Musk (who warns AI could lead to World War Three). Governments have started to think seriously about industrial and fiscal policies to slow the march of the robots, as well as to position their nations to take advantage of what could be the next big industrial revolution.
However, there’s optimism as well. In a survey by consultancy Capgemini, AI had created new roles in 75 per cent of the large companies implementing it, and nearly as many attributed a 10 per cent rise in sales to AI. A consumer survey sponsored by microelectronics design company ARM found that 61 per cent of respondents thought AI and greater automation would improve society rather than destroy it.
Entering the world of robotics

Robots clearly offer exciting possibilities, and in any case they’re here to stay – but how can an engineer get to grips with this technology? An initial entry into the robotics world could be made using kits such as NXP’s FSLBOT, an easy-to-use mechatronics development and demonstration platform, or the RP6v2, an economical autonomous mobile robot system.
Both provide experience with mechatronics development, programming and processors. The RP6v2 offers opportunities to measure light intensity, detect collisions and low battery levels, measure and control the rotational speed of its motors via high-resolution encoders, and exchange data with other robots or devices.
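The RP6v2 itself is normally programmed in C, but the closed-loop speed control it demonstrates is language-agnostic: count encoder ticks over an interval, compare with the target, and adjust the motor drive accordingly. The Python sketch below outlines one such proportional-integral step; the gains, tick targets and PWM range are illustrative assumptions rather than the kit’s actual values.

```python
def speed_control_step(target_ticks, measured_ticks, state,
                       kp=0.8, ki=0.2, max_pwm=255):
    """One iteration of a PI speed loop: compare the encoder ticks counted in
    the last interval with the target, and adjust the motor PWM accordingly."""
    error = target_ticks - measured_ticks
    state["integral"] += error
    pwm = kp * error + ki * state["integral"]
    return max(0, min(max_pwm, int(pwm)))

# Example: aim for 40 encoder ticks per control interval
state = {"integral": 0.0}
for measured in (0, 15, 28, 37, 41):           # made-up encoder readings
    print(speed_control_step(40, measured, state))
```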
Conclusions
Although the speed, direction and impact of their development are hotly debated, there’s no doubt that robotics and AI are here to stay.

We’ve seen the richness, innovation and variety of robotic technologies that already exist, and the above examples offer simple entry points for engineers and enthusiasts wishing to explore this developing world.
Given the extent of the technologies’ conceivable consequences, for good or ill, it’s important that the debate is informed by as many opinions and viewpoints as possible.
After all, some commentators concerned about accelerating AI-driven capabilities fear that the option to influence the outcome may not always be available for us to exercise.