Algorithms normally used to track aircraft, ships and other vehicles are being used to monitor space junk and predict where it will go.
Currently the US Department of Defense tracks around 17,300 objects the size of a softball or larger, orbiting the Earth at speeds of up to seven kilometres per second.
They can cause serious damage if they collide with something else. Last year a tiny paint fleck caused a crack in a window of the International Space Station.
While radar can track objects in low Earth orbit, anything further out requires optical sensors, typically telescopes with cameras attached.
“The sensor works by taking an image when the telescope’s location is in darkness but the satellite is still illuminated by the Sun, so it will resemble a star,” says Travis Bessell from the Australian Defence Science and Technology Group.
Image processing software can distinguish which dots in an image are orbiting objects and which are real stars, but it can't identify the objects. That's where the algorithms come in.
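One simple way to separate objects from stars is to compare two exposures of the same patch of sky: dots that stay put are stars, while dots that move (or appear) are candidate satellites or debris. The sketch below is purely illustrative and is not the DST Group's actual software; the function name, coordinates, and tolerance are all invented for the example.

```python
# Illustrative sketch (not the actual tracking software): dots whose
# positions match between two exposures are treated as stars; dots with
# no nearby match in the earlier frame are moving-object candidates.

def find_moving_dots(frame1, frame2, tolerance=2.0):
    """Each frame is a list of (x, y) dot centroids from image processing.

    A dot in frame2 with a near-identical position in frame1 is assumed
    to be a star; anything else is flagged as a candidate object.
    """
    candidates = []
    for x2, y2 in frame2:
        is_star = any(
            (x2 - x1) ** 2 + (y2 - y1) ** 2 <= tolerance ** 2
            for x1, y1 in frame1
        )
        if not is_star:
            candidates.append((x2, y2))
    return candidates

first = [(10.0, 10.0), (50.0, 80.0)]
second = [(10.1, 9.9), (50.0, 80.1), (33.0, 41.0)]  # third dot is new
print(find_moving_dots(first, second))  # → [(33.0, 41.0)]
```

In practice the software must also handle sensor noise, atmospheric jitter, and pointing drift, so real pipelines match dots against a star catalogue rather than a single earlier frame.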
“Basically, the algorithms we’re looking into can stitch the dots together over time,” Travis says.
The algorithms can determine the location of an object, join the dots from multiple images to determine its orbit, then predict where it’s going to be in the future.
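The idea of "joining the dots" can be sketched with a toy example: fit the motion of a tracked dot across timestamped detections, then extrapolate to a future time. Real orbit determination fits a Keplerian (or perturbed) orbit rather than a straight line; the linear least-squares fit below, and all its numbers, are invented for illustration only.

```python
# Toy sketch of stitching detections together over time and predicting
# where an object will be next. A straight-line fit stands in for the
# far more sophisticated orbit models used in practice.

def fit_track(times, positions):
    """Least-squares straight line per axis: position ~ p0 + v * t."""
    n = len(times)
    t_mean = sum(times) / n
    fits = []
    for axis in range(len(positions[0])):
        vals = [p[axis] for p in positions]
        x_mean = sum(vals) / n
        num = sum((t - t_mean) * (x - x_mean) for t, x in zip(times, vals))
        den = sum((t - t_mean) ** 2 for t in times)
        slope = num / den
        fits.append((x_mean - slope * t_mean, slope))
    return fits

def predict(fits, t):
    """Extrapolate the fitted track to time t."""
    return tuple(p0 + v * t for p0, v in fits)

# Three detections of the same object at t = 0, 10 and 20 seconds:
obs_t = [0.0, 10.0, 20.0]
obs_xy = [(100.0, 200.0), (105.0, 198.0), (110.0, 196.0)]
track = fit_track(obs_t, obs_xy)
print(predict(track, 30.0))  # → (115.0, 194.0)
```

The same two steps, fit the observed motion, then propagate it forward, underlie the real algorithms, with gravity and atmospheric drag replacing the straight line.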
Making better predictions about an object’s future location could reduce the number of space junk collisions.
The team is collaborating with researchers in the US, UK, Canada and New Zealand.