
Techie found an error message so rude the CEO of IBM apologized for it

2 weeks 4 days ago
Big Blue turned the air blue

Who, Me?  Oh, bother, it's Monday. But rather than curse about another working week rolling around, The Register welcomes it with another instalment of Who, Me? It's the reader-contributed column in which you confess to workplace whoopsies and reveal how you survived them.…

Simon Sharwood

CodeSOD: A Monthly Addition

2 weeks 4 days ago

In the ancient times of the late 90s, Bert worked for a software solutions company. It was the kind of company that other companies hired to do software for them, releasing custom applications for each client. Well, "each" client implies more than one client, but in this company's case, they only had one reliable client.

One day, the client said, "Hey, we have an application we built to handle scheduling helpdesk workers. Can you take a look at it and fix some problems we've got?" Bert's employer said, "Sure, no problem."

Bert was handed an Excel file, loaded with VBA macros. In the first test, Bert tried to schedule 5 different workers for their shifts, only to find that resolving the schedule and generating output took an hour and a half. Turns out, "being too slow to use" was the main problem the client had.

Digging in, Bert found code like this:

IF X = 0 THEN Y = 1
ELSE IF X = 1 THEN Y = 2
ELSE IF X = 2 THEN Y = 3
ELSE IF X = 3 THEN Y = 4
ELSE IF X = 4 THEN Y = 5
ELSE IF X = 5 THEN Y = 6
ELSE IF X = 6 THEN Y = 7
ELSE IF X = 7 THEN Y = 8
ELSE IF X = 8 THEN Y = 9
ELSE IF X = 9 THEN Y = 10
ELSE IF X = 10 THEN Y = 11
ELSE IF X = 11 THEN Y = 12
END IF END IF END IF END IF END IF END IF END IF END IF END IF END IF

Clearly it's there to convert zero-indexed months into one-indexed months. This, you may note, could be replaced with Y = X + 1 and a single boundary check. I hope a boundary check exists elsewhere in this code, because otherwise it may have problems in the future. Well, it has problems in the present, but it will have problems in the future too.
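The whole ladder is just a lookup table for x + 1. A minimal sketch of the replacement (in Python for brevity; the VBA equivalent is one If and one assignment):

```python
def month_index_to_number(x: int) -> int:
    """Convert a zero-indexed month (0-11) to a one-indexed month (1-12)."""
    if not 0 <= x <= 11:
        # The one boundary check the original ladder silently lacked.
        raise ValueError(f"month index out of range: {x}")
    return x + 1
```

The entire twelve-branch cascade reduces to an addition plus a guard against out-of-range input.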

Bert tried to explain to his boss that this was the wrong tool for the job, that he was the wrong person to write scheduling software (which can get fiendishly complicated), and that the base implementation was so bad it'd likely be easier to just junk it.

The boss replied that they were going to keep this customer happy to keep money rolling in.

For the next few weeks, Bert did his best. He managed to cut the scheduling run time down to 30 minutes. This was a significant enough improvement that the boss could go back to the client and say, "Job done," but not significant enough for anyone to actually use the program. The whole thing was abandoned.

Some time later, Bert found out that the client had wanted to stop paying for custom software solutions, and had drafted one of their new hires, fresh out of school, into writing software. The new hire did not have a programming background, but instead was part of their accounting team.

Remy Porter

What Happens When AI Directs Tourists to Places That Don't Exist?

2 weeks 4 days ago
The director of a tour operation remembers two tourists arriving in a rural town in Peru determined to hike alone in the mountains to a sacred canyon recommended by their AI chatbot. But the canyon didn't exist, and a high-altitude hike could be dangerous (especially where cellphone coverage is also spotty). They're part of a BBC report on travellers arriving at their destination "only to find they've been fed incorrect information or steered to a place that only exists in the hard-wired imagination of a robot..."

"According to a 2024 survey, 37% of those surveyed who used AI to help plan their travels reported that it could not provide enough information, while around 33% said their AI-generated recommendations included false information."

Some examples?

- Dana Yao and her husband recently experienced this first-hand. The couple used ChatGPT to plan a romantic hike to the top of Mount Misen on the Japanese island of Itsukushima earlier this year. After exploring the town of Miyajima with no issues, they set off at 15:00 to hike to the mountain's summit in time for sunset, exactly as ChatGPT had instructed them. "That's when the problem showed up," said Yao, a creator who runs a blog about traveling in Japan, "[when] we were ready to descend [the mountain via] the ropeway station. ChatGPT said the last ropeway down was at 17:30, but in reality, the ropeway had already closed. So, we were stuck at the mountain top..."

- A 2024 BBC article reported that [dedicated travel AI site] Layla briefly told users that there was an Eiffel Tower in Beijing and suggested a marathon route across northern Italy to a British traveller that was entirely unfeasible...

- A recent Fast Company article recounted an incident in which a couple made the trek to a scenic cable car in Malaysia that they had seen on TikTok, only to find that no such structure existed. The video they'd watched had been entirely AI-generated, either to drum up engagement or for some other strange purpose.
Rayid Ghani, a distinguished professor in machine learning at Carnegie Mellon University, tells them that an AI chatbot "doesn't know the difference between travel advice, directions or recipes. It just knows words. So, it keeps spitting out words that make whatever it's telling you sound realistic..."

Read more of this story at Slashdot.

EditorDavid