Tiago Albuquerque, a Deliveroo rider, rolled up to a home in Ballsbridge one Thursday evening to deliver a Chinese takeaway. He knocked, rang and waited.
The customer who had ordered it through the food delivery app didn’t come out to collect it.
They didn’t pick up when he rang either. “No one answered the door, no one answered the phone,” he says.
Albuquerque tapped open the app, and was given two options, he says: to leave the food, or take it with him.
“What I learned the hard way was that [because] I said I was taking the food with me, Deliveroo didn’t pay me,” says Albuquerque. To him, he says, it seemed like the food was considered his payment.
This is just one of the problems that workers have with the algorithm that drives the Deliveroo app, says Albuquerque.
In other ways, too, it can fail to respond to situations in the way a human manager would, he says.
Fiachra Ó Luain, an organiser with the English Language Students’ Union, which is helping to draw attention to the conditions of gig-based workers, says other potentially more dangerous situations also crop up because of this algorithm acting as a manager.
It often entices people to work in precarious situations, failing to account for staff safety, he says. “If I’m a manager, I have responsibility for the working conditions of my staff.”
In January, a court in Italy found that an algorithm Deliveroo uses discriminates against some workers: it punished those who failed to show up for booked shifts by offering them fewer jobs in future, without recognising valid reasons for absence such as sickness or strike action.
Even if Deliveroo as a company didn’t want to discriminate against these workers, it’s still indirect discrimination, says Valerio De Stefano, a professor in labour law at KU Leuven in Belgium who specialises in artificial intelligence and labour regulation.
“It’s indirect discrimination because the algorithm didn’t take into account that you could miss a shift for a valid reason that’s protected by the constitution,” says De Stefano, noting the significance of the case across Europe as a legal challenge to how algorithms work.
“We have a dedicated rider team to help riders with any concerns they may have and ensure that they feel supported at all times,” said a spokesperson for Deliveroo, in response to questions sent last Friday.
Riders are paid for all completed orders once they’ve left the food in a safe place, they said.
Like Albuquerque, Gabriel Mont doesn’t deliver food as his main job.
Mont has accounts for the three main delivery apps in the city: Deliveroo, UberEats and Just Eat.
Deliveroo is his preferred app at the moment, he says. He sometimes uses UberEats and has never used Just Eat.
The main issue with the UberEats app, he says, is that it doesn’t tell you where you need to drop off a delivery until you click to accept the delivery job.
“One time I got a delivery that was 10km from here,” says Mont. “But then you have to come back from far away as there’s no orders over there and then start working again.”
This is a problem, says Mont, as you get a bit more pay for longer-distance deliveries but not as much as you’d get from dropping off a larger number of orders within a smaller area.
But there are bigger problems with this late reveal, says Albuquerque. Since you don’t know where you’re going, you might not realise a delivery is potentially unsafe for you until after you have collected the food.
When Level 5 restrictions came into effect in Dublin last October, Albuquerque began to notice a shift in how delivery riders were treated.
Riders felt increasingly vulnerable in certain areas, he says, as they were such visible frontline workers on the city’s deserted streets.
“We are in the streets very visible with expensive bikes. That’s very tempting for them,” he says – with “them”, in this case, referring to those attacking delivery cyclists.
The week before last, a €16 payment popped up on the Deliveroo app to deliver to the Convention Centre at Spencer Dock, says Albuquerque – something that he’s never seen before.
“It’s because people are refusing to go there,” he says, as they don’t feel safe – yet the app keeps enticing them with higher payments, because the algorithm doesn’t understand why they are refusing.
This is especially an issue for newbie riders, says Ó Luain. They often have a poor understanding of the city, are in more precarious financial situations, and often have visa restrictions on working.
Bigger payments to areas where other, more savvy riders refuse to go become more enticing for them, he says.
“A newbie mightn’t realise that,” he says. “They might be freaking out about rent or whatever.”
“A supervisor has to understand the consequence of what’s happening,” says Ó Luain, adding that if a supervisor sees working conditions as being unsafe they’re responsible for that decision. An AI is different, he says.
Albuquerque says the same. Refuse to deliver to an area and the app asks you why you rejected the order, he says. One of these options is “dislike the delivery area”.
“If you choose this option, a few minutes later the application is going to throw up an order for you in the same area,” he says, suggesting to him that the AI doesn’t listen to riders.
A spokesperson for Deliveroo said: “The safety of riders is our absolute priority and we take every step to ensure riders feel safe when on the road.”
Riders can raise safety concerns about specific delivery areas via the rider app, they said.
“We never want riders to take any risks and will continue to clearly communicate this to riders, along with our safety guidance and road safety campaigns,” they said.
There’s yet another way in which Deliveroo’s platform, which runs on an algorithm called Frank, works against riders, says Ó Luain: it undermines efforts to organise worker demonstrations highlighting the threats they face when delivering.
“[W]hen there is greater demand for deliveries than there are supply of delivery workers they will offer you up to 1.5 boost – so instead of €3 you’ll get up to €4.50,” says Ó Luain.
During strikes at the end of January, workers turned off their Deliveroo apps, meaning there was greater demand for the remaining workers and, automatically, the payments for deliveries went up.
Ó Luain describes this technology as “scabtech”, as it actively encourages workers to break strike actions by paying more during times when the supply of labour is low.
Tech can be used for union busting, says De Stefano, the professor in labour law, and although he wouldn’t be surprised if the application works as such, he can’t say for certain that such a mechanism is written into the app.
“We know that normally platforms are not very eager to have a unionised workforce,” he says, noting Amazon’s anti-union stance.
A spokesperson for Deliveroo said: “We fully support the right for riders to express themselves and their concerns by choosing not to work.”
Riders work independently and the work is flexible so each rider has the choice whether to work or not, the spokesperson said.
In Italy, the rights to strike and to take sick days are constitutionally protected, De Stefano says.
“And so if you penalise someone if they miss a shift, even though they were exercising a constitutional right, this is why you are discriminating,” he says.
That’s why the Italian court found earlier this year that an algorithm that Deliveroo uses discriminates against workers.
In Ireland, the Deliveroo app works slightly differently for riders, says Albuquerque. Here you do not need to pre-book the time slot you are going to work – you just turn the app on and wait for the first delivery to pop up, he says.
A spokesperson for Deliveroo said that the algorithm the Italian court ruled on is an old booking model and is no longer in use.
“The decision was hypothetical only – and no findings were made relating to any riders who used the system. This was a single case of a system that is no longer in use,” the Deliveroo spokesperson said.
The significance of the Italian court’s decision under European law was that it looked past the seeming objective neutrality of an algorithm and found it was acting in a discriminatory way, says De Stefano.
“The point is that you never know what’s going on with the algorithm so you never know how the algorithm really factors any kind of behaviour or objective that the platform has,” he says.
Ó Luain can see the relevance of the court’s findings here. “You can’t outsource those management functions to an algorithm and expect them to be grand,” he says.