AI: Powering sorting sophistication in a data-driven future - letsrecycle.com

2022-09-12

Artificial intelligence (AI) has become something of a buzzword given its applications in a broad array of industries. AI mimics the cognitive abilities of humans, though it would be a stretch to apply the term to every technology that simply performs a task on its own. Experts at AMP Robotics look at some of the issues around AI and recycling.

AI is a collection of algorithms that enable a computer to learn from experience. It’s an approach to teaching machines to do what humans can do. Humans are adept at identifying objects from visual inputs, whether it’s an image or views of the real world, and it’s this ability that enables us to perform complex tasks—like sorting materials in a recycling facility.

Machine learning is a subset of AI, and deep learning is a type of machine learning. Neural networks are deep learning systems that learn to build patterns out of simple features and to perform a task by analyzing labeled examples.
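As a concrete (and heavily simplified) illustration of learning from labeled examples, here is a single artificial "neuron" trained by gradient descent to separate two material classes using two numeric features. The features, labels, and learning rate are invented for this sketch; a real system like Neuron uses deep networks over full camera images.

```python
import numpy as np

# Toy illustration of learning from labeled examples: a single "neuron"
# (logistic regression) separates two material classes from two invented
# numeric features. This is a teaching sketch, not AMP's architecture.

rng = np.random.default_rng(0)

# Labeled examples: rows are [brightness, opacity]; label 1 = HDPE, 0 = other.
X = np.array([[0.9, 0.8], [0.8, 0.9], [0.2, 0.1], [0.1, 0.3]])
y = np.array([1.0, 1.0, 0.0, 0.0])

w = rng.normal(size=2)   # learnable weights
b = 0.0                  # learnable bias

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Gradient descent: repeatedly nudge the weights to reduce prediction error.
for _ in range(2000):
    pred = sigmoid(X @ w + b)
    grad = pred - y                    # error signal per example
    w -= 0.5 * (X.T @ grad) / len(y)
    b -= 0.5 * grad.mean()

# After training, the neuron assigns high scores to HDPE examples
# and low scores to the others.
print(sigmoid(X @ w + b))
```

The same pattern-from-examples principle scales up: deep networks stack many such units so the features themselves are learned rather than hand-picked.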

For example, AMP’s proprietary AI technology, AMP Neuron, works by perceiving images of recyclable materials on conveyor belts within recycling facilities. Looking for specific colors, shapes, textures, logos, and more, the system recognizes patterns correlated with material type. Neuron digitizes these images and uses the data generated to infer in real time the recyclable materials and contaminants in sortation environments.

Recycling involves infinite variability in the kinds, shapes, and orientations of the objects on a conveyor belt, requiring nearly instantaneous identification along with the quick dispatch of a new trajectory to the robot arm used to sort the material on the belt. Training a neural network to detect objects in the recycling stream is not easy—but it’s an entirely different challenge when you consider the physical deformations that these objects can undergo by the time they reach a recycling facility. They can be folded, torn, smashed, or partially obscured by other objects.
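The identify-then-dispatch step described above can be pictured as a small, latency-sensitive loop: the vision system reports where an object was at detection time, and the controller predicts where it will be when the arm arrives. Every name and number here (Detection, plan_pick, the belt speed) is a hypothetical stand-in, not AMP's actual interface.

```python
from dataclasses import dataclass

BELT_SPEED_M_S = 0.5  # assumed belt speed; real values vary by facility

@dataclass
class Detection:
    material: str
    x: float          # position along the belt (metres) at detection time
    timestamp: float  # seconds

def plan_pick(det: Detection, now: float) -> float:
    """Predict where the object will be when the arm reaches it,
    so the pick lands on a moving target rather than a stale position."""
    return det.x + BELT_SPEED_M_S * (now - det.timestamp)

det = Detection(material="HDPE", x=1.0, timestamp=0.0)
target = plan_pick(det, now=0.2)  # 0.2 s later the object has moved 0.1 m
print(target)  # 1.1
```

The real constraint is that detection, planning, and dispatch must all complete within the fraction of a second the object spends in the arm's reach.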

Ultimately, the neural network learns to identify objects in the same way a human does. It’s this ability to construct a “model” of an object using simple building blocks like material features that makes deep learning so powerful. AMP’s team of approximately 100 data annotators examines these images and categorizes the objects belonging to each material type. We continuously collect random samples from all the facilities that use our systems, and then annotate them, add them to our database, and retrain our neural networks. We also test our networks to find models that perform best on target material and do targeted additional training on materials that our systems have trouble identifying correctly.
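The collect-annotate-retrain cycle in this paragraph can be sketched schematically. All function names and the toy accuracy model below are illustrative stand-ins for AMP's internal tooling; the point is the shape of the feedback loop: sample, annotate, retrain, then target the weak categories.

```python
import random

dataset = []  # accumulated (image, label) pairs

def sample_images(facility, n):
    # Stand-in for pulling random frames from a facility's cameras.
    return [f"{facility}-img-{i}" for i in range(n)]

def annotate(image):
    # Stand-in for human data annotators assigning a material label.
    return (image, random.choice(["HDPE", "PET", "paper", "aluminium"]))

def retrain(data):
    # Stand-in for retraining a neural network; here we just pretend
    # that classes with fewer examples perform worse.
    counts = {}
    for _, label in data:
        counts[label] = counts.get(label, 0) + 1
    return {label: min(1.0, c / 10) for label, c in counts.items()}

random.seed(0)
for facility in ["mrf-a", "mrf-b"]:
    dataset.extend(annotate(img) for img in sample_images(facility, 20))

accuracy = retrain(dataset)
# Categories below the accuracy bar get targeted extra training data
# in the next round, mirroring the process described above.
weak = [label for label, acc in accuracy.items() if acc < 0.9]
print(sorted(weak))
```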

(above: a YouTube video from AMP of a sorting robot picking mixed plastics)

Consider high-density polyethylene (HDPE), a plastic commonly used for household cleaning products. Batches of HDPE tend to be mixed with other plastics and may have paper or plastic labels, making it difficult to detect the underlying object’s chemical composition. An AI-driven computer-vision system can determine that a bottle is HDPE and not something else by recognizing its packaging. Such a system can also use attributes like color, opacity, and form factor to increase detection accuracy, and even sort by color or specific product, reducing the amount of reprocessing needed.

But why is AI preferable to the human eye in identifying this material? A human can instantly identify its function and characteristics: it's a household container used for a specific task, it's an HDPE bottle, and it's recyclable. What can deep-learning-based algorithms recognize in that same split second? The AI can categorize it as a container and determine that it is opaque, that the polymer is HDPE, and that it is colored.

The AI knows because it uses machine learning to find patterns in images from materials recovery facilities (MRFs). It has seen packaging at every angle and at every level of dirt, grime, and damage, and it can pick an object without seeing all of it. More identification parameters translate to higher identification accuracy. With specific material properties, the system can target colored or natural HDPE as a positive pick, or remove an item as contamination if it is non-food-grade or carries external contamination. The integration of AI into recycling operations provides a more detailed picture of that one digitized object, along with every other object the AI perceives.
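A minimal sketch of the pick-versus-contamination decision this paragraph describes, assuming a handful of per-object attributes. The attribute names and rules are illustrative, not AMP's actual classification schema.

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    polymer: str        # e.g. "HDPE", "PET"
    colour: str         # "natural" or a specific colour
    food_grade: bool
    contaminated: bool  # visible external contamination

def sort_decision(obj: DetectedObject, target: str) -> str:
    # Target material in good condition: route to the target bale.
    if obj.polymer == target and obj.food_grade and not obj.contaminated:
        return "positive pick"
    # Non-food-grade or contaminated items are pulled as contamination.
    if not obj.food_grade or obj.contaminated:
        return "remove as contamination"
    return "leave on belt"

bottle = DetectedObject("HDPE", "natural", food_grade=True, contaminated=False)
print(sort_decision(bottle, target="HDPE"))  # positive pick
```

In practice the decision draws on many more attributes (form factor, brand, grade), which is exactly why richer identification parameters improve sorting outcomes.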

AI accuracy and performance are built on training a platform with as many real-world images as possible, and AMP's systems have seen more materials, in more permutations, than any others. In fact, AMP now has the world's largest data set of recyclable material images for use in machine learning. Neuron recognizes 50 billion objects annually, a number that continues to grow exponentially as our install base expands. A larger install base translates to a stronger network that continuously improves. This object recognition figure encompasses hundreds of categories of paper, plastics, metals, and other materials, separated further by form factor, color, grade, and more. Companies can build AI platforms in lab environments, with clean, controlled materials and belt speeds, but such platforms become truly valuable only when they can identify dirty and dynamic materials in operational MRFs.

AI is the enabling technology essential for generating and capturing the data to optimize recycling operations. Vision-based AI software identifies and characterizes objects in the waste stream in real time, digitizing each item that passes by. These objects are captured as a new form of data, including object counts, packaging descriptions, and more, over time. AMP has deployed hundreds of robots and sensors that process billions upon billions of objects on conveyor belts in MRFs. We continue to expand our material categories by exploring and developing subcategories useful for MRFs as well as for downstream recyclers. If a challenging packaging type or new material emerges, we’re able to capture imagery and train the AI to identify the object.
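The "new form of data" described above can be pictured as a stream of per-object records emitted by the sensors; simple aggregation then yields the object counts and material composition an MRF operator would track over time. The record fields here are illustrative assumptions, not AMP's data model.

```python
from collections import Counter

# A toy stream of digitized object records, one per item seen on the belt.
records = [
    {"material": "HDPE", "form": "bottle"},
    {"material": "PET", "form": "bottle"},
    {"material": "HDPE", "form": "jug"},
    {"material": "paper", "form": "sheet"},
    {"material": "HDPE", "form": "bottle"},
]

# Aggregate the stream into counts and composition shares.
counts = Counter(r["material"] for r in records)
share = {m: c / len(records) for m, c in counts.items()}
print(counts["HDPE"], round(share["HDPE"], 2))  # 3 0.6
```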

The digitization of scrap objects in recycling opens up many potential applications. The first two deployed in MRFs today are robotic sorting and the descriptive and diagnostic analytics provided by standalone sensors. As sensors become distributed throughout an MRF, we're able to help MRFs become more data-driven facilities that reduce costs and increase revenue. Ultimately, expanding the use of AI in recycling will increase recycling rates and landfill diversion by making valuable materials easier to reprocess and reuse.

(Note: This Special Report was contributed by AMP Robotics)


© Environment Media Group Ltd 2022. Company registration number 03959158. Registered office: 55 Station Road, Beaconsfield, Bucks HP9 1QL
