Automated undersea mapping, swarms of computer-driven drones, self-driving cars – 20 years ago, such devices could not be found outside the pages of an Isaac Asimov novel.
Today, they are the focus of several research laboratories in the Department of Electrical and Computer Engineering at BYU. Each laboratory is headed by a dedicated faculty member, who directs research and provides guidance to undergraduate, graduate and doctoral students. Campus-based projects use sophisticated hardware and software to solve challenging robotics problems.
The Daily Universe has spent time with two of these groups: the Intelligent Multi-Action Coordination and Control Lab and the Robotic Vision Lab.
Both groups call the department’s information systems and science wing home. Although both laboratories address challenges related to machine automation and automation control, the groups take different approaches and find different applications for their work.
Cammy Peterson, a faculty member who conducts research in the MAGICC Lab, explained the “big picture” behind much of her own work and the lab’s.
“Most of the algorithms we use are for autonomous systems generally,” Peterson said. “They could be robots underwater, on the water, or on land.”
An algorithm is a step-by-step mathematical procedure that determines how computer systems – such as those in MAGICC robots – make independent decisions. MAGICC develops algorithms that control the movement of physical robots through space.
“A lot of what I do and teach is on the control side,” Peterson said. “How do you control the vehicles? How do you detect the path… How do you make sure that they actually follow that path?”
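The control loop Peterson describes – detect the path, then make sure the vehicle actually follows it – can be illustrated with a minimal sketch. This is a hypothetical example, not the lab’s actual code: a basic proportional controller that repeatedly steers a simulated vehicle toward a waypoint. The function name, gain, and waypoint values are all illustrative.

```python
import math

def follow_waypoint(x, y, heading, wx, wy, gain=1.0):
    """Return a turn-rate command proportional to the heading error."""
    desired = math.atan2(wy - y, wx - x)  # bearing from the vehicle to the waypoint
    # Wrap the heading error into [-pi, pi] so the vehicle turns the short way.
    error = (desired - heading + math.pi) % (2 * math.pi) - math.pi
    return gain * error

# Simulate a constant-speed vehicle closing in on a waypoint at (10, 10).
x, y, heading, speed, dt = 0.0, 0.0, 0.0, 1.0, 0.1
for _ in range(300):
    heading += follow_waypoint(x, y, heading, 10.0, 10.0) * dt
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt

dist = math.hypot(10.0 - x, 10.0 - y)  # final distance to the waypoint
```

Real autonomous vehicles layer far more sophistication on top of this idea – wind disturbances, sensor noise, and dynamics – but the proportional feedback loop is the seed of the stability Peterson credits modern drones with.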
Peterson hinted at the huge advances in autonomous vehicle technology – drones in particular – that have developed in the past few decades.
“Some of those early drones, even just a decade ago, were almost uncontrollable when you tried to fly them,” Peterson said. “Any slight wind or movement would throw them off course. Now you can basically take a drone and it’s stable enough that someone who hasn’t flown (a drone) before can go out and fly it.”
Peterson noted that advances in drone technology alone “open up the possibilities for how we can use them and make the world better.”
Jaron Ellingson, a Ph.D. student in the Department of Mechanical Engineering working in the MAGICC Lab, hopes to take advantage of these developments to build a system of autonomous drone swarms based on a decentralized approach.
Ellingson explained that the drones use algorithms to estimate each other’s locations and adjust their flight paths accordingly.
He envisions companies like Amazon or UPS using this system to organize large fleets of autonomous drones. “They can broadcast their positions … and other drones can take that position … and avoid each other.”
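The decentralized idea Ellingson describes – each drone broadcasts its own position, and every other drone adjusts its path on its own – can be sketched in a few lines. This is an illustrative toy, not the MAGICC Lab’s system; the class, method names, and tuning constants are all assumptions made for the example.

```python
import math

class Drone:
    """One member of a decentralized swarm: no central planner exists."""
    def __init__(self, name, x, y, vx, vy):
        self.name, self.x, self.y, self.vx, self.vy = name, x, y, vx, vy

    def broadcast(self):
        # Each drone shares only its own identity and position.
        return (self.name, self.x, self.y)

    def avoid(self, messages, min_sep=2.0, push=0.5):
        # Nudge velocity away from any broadcast neighbor that is too close.
        for name, ox, oy in messages:
            if name == self.name:
                continue
            dx, dy = self.x - ox, self.y - oy
            dist = math.hypot(dx, dy)
            if 0 < dist < min_sep:
                self.vx += push * dx / dist
                self.vy += push * dy / dist

    def step(self, dt=0.1):
        self.x += self.vx * dt
        self.y += self.vy * dt

# Two drones on a near head-on collision course.
a = Drone("a", 0.0, 0.0, 1.0, 0.0)
b = Drone("b", 5.0, 0.1, -1.0, 0.0)
closest = float("inf")
for _ in range(100):
    msgs = [a.broadcast(), b.broadcast()]
    a.avoid(msgs)
    b.avoid(msgs)
    a.step()
    b.step()
    closest = min(closest, math.hypot(a.x - b.x, a.y - b.y))
```

In the simulation, the two drones begin to converge, detect each other from the broadcasts, and push apart before colliding – the same pattern, scaled up, that a delivery fleet could use without any central traffic controller.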
The MAGICC lab relies on custom computer code, designed by human programmers and engineers, to control the movement of autonomous vehicles. Algorithms are fine-tuned to needs and challenges the researchers understand well. Ellingson and his squadron of drones are like a conductor and a symphony orchestra – Ellingson knows the sound he wants and gives the performers the instructions to produce it.
Robotic Vision Lab
However, in another part of the Department of Electrical and Computer Engineering, students and faculty practice the computational equivalent of free jazz.
The Robotic Vision Lab focuses on using artificial intelligence and machine learning to give robots vision. Its research includes self-driving cars, facial recognition and food screening.
Casey Sun, a Ph.D. student in the lab, explained how Robotic Vision uses machine learning techniques to enable its projects. “You could collect some clean data in a lab setting … and try to fit the model to that clean data. The model would be able to learn some patterns in the clean data.”
To achieve robotic vision, the lab uses special computer programs, called neural networks, that learn to recognize patterns by constantly comparing examples of “clean” data produced in the lab with examples from the real world – so-called “noisy” data.
In essence, Sun explained, the neural network is saying, “I’ve never seen this pattern before, so I’m going to change (my model).”
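Sun’s two-stage idea – fit a model to clean lab data, then let it update itself when real-world noisy data doesn’t match – can be sketched with a deliberately simplified learner. A single perceptron stands in here for the lab’s neural networks, and the data, thresholds, and update rule are all assumptions made for illustration.

```python
import random

random.seed(0)

def make_point(label, noise=0.0):
    # Two classes: class 1 clusters near (1, 1), class 0 near (-1, -1).
    cx = 1.0 if label else -1.0
    std = 0.2 + noise  # "clean" lab data is tight; "noisy" real data is spread out
    return [cx + random.gauss(0, std), cx + random.gauss(0, std)], label

def train(points, w, b, lr=0.1, epochs=20):
    # Perceptron-style learning: whenever the model misclassifies a point
    # ("I've never seen this pattern before"), it shifts its weights.
    for _ in range(epochs):
        for (x1, x2), label in points:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = label - pred
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

clean = [make_point(i % 2) for i in range(100)]
noisy = [make_point(i % 2, noise=0.6) for i in range(100)]

w, b = train(clean, [0.0, 0.0], 0.0)  # first, fit the model to clean lab data
w, b = train(noisy, w, b)             # then adapt the same model to noisy data

accuracy = sum(
    (1 if w[0] * x1 + w[1] * x2 + b > 0 else 0) == label
    for (x1, x2), label in noisy
) / len(noisy)
```

A real vision network has millions of parameters rather than three, but the mechanism is the same: an error on unfamiliar data drives a change to the model.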
Faculty member DJ Lee, who holds a Ph.D. in electrical engineering, leads the lab’s various projects. He described how he hopes some of those projects will benefit the campus and global communities.
“Our facial motion authentication project can improve the safety and convenience of users. Our visual inspection automation projects improve food safety and food production efficiency. Both will have a significant impact on our daily lives,” Lee said.
Looking towards the future
Researchers from both labs have expressed their vision for how robotics can positively impact people’s lives.
“My hope is that our work in autonomy will lead to a world where people do not need to perform repetitive, dangerous or monotonous work, and can focus on more important and rewarding projects,” said Adam Welker, an undergraduate student in the MAGICC Lab.
“I also believe that drones can connect us to places. Whether that is providing support to isolated rural areas or facilitating transportation in crowded cities, drones have great potential to improve our infrastructure,” Welker said.
Cammy Peterson shared her concerns as well as her hopes for robotics. “It depends on the day. I’m always either an optimist or a skeptic,” she said.
She cited self-driving cars as an example of a technology that has achieved a lot but still has a long way to go. “It’s amazing what they’ve done, but it’s still a huge challenge. There really is no substitute for how smart humans are. That’s definitely something AI has helped me appreciate: the things we don’t even think about are really complex.”