Hang on... consider that statement. There are lots of different computer languages, and most people outside of institutional care would readily accept that most of those languages do pretty much the same things (forget Prolog. No seriously, forget it). A formal method is much more abstract than a computer language: so why are there so many? Why didn't the second[1] one invented just get extended ad infinitum as new concepts were required?
Because there's a basic misunderstanding going on here: people keep obstinately thinking that the purpose of formal methods is to improve software production. Bzzzzt - wrong. The purpose of formal methods is to sell books and training courses.
I think that bears repeating.
The purpose of formal methods is to sell books and training courses.
When you understand this simple fact, it becomes obvious why there are so many formal methods. I can't sell my books and training courses if my method is the same as yours, so I have to dream up more symbols to join boxes with, invent yet another name for basic stuff like state transition diagrams and entity relationship whatevers, so I can claim copyright on what's not mine and take my rightful place on the gravy train.
Want an example? Well, why do you think the gang of four took the perfectly well understood, existed-for-donkeys-years principle of Publish and Subscribe and renamed it "The Observer Pattern"? These - for want of a better and more profane word - people understand the power of owning the jargon. Consultants come along, steal a grab-bag of common methods, ram them together whether they fit or not, change the names to protect the guilty (themselves), and start writing their faux-academic magnum opus to join the ill-gotten dots together. What amounts to common or garden plagiarism is justified with the mantra of "best practice". But why is it OK for them to pick and choose the best practice, but not OK for me to pick and choose? If I want to use a mixture of STDs, ERDs, Boochygrams and Kanban, why not? Why does the method have to be swallowed whole? Because if you don't buy into the thing wholesale, then you won't be buying books and... well, we've been here.
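For the sceptics: here is how little daylight there is between the two names. A minimal sketch (all names - Publisher, subscribe, publish - are my own illustrative choices, not from any particular library), with the Gang of Four's vocabulary noted in the comments:

```python
# Plain old publish-and-subscribe. Rename three identifiers and you
# have "the Observer Pattern"; the mechanism is identical.

class Publisher:                       # GoF: "Subject"
    def __init__(self):
        self._subscribers = []

    def subscribe(self, callback):     # GoF: "attach" an "Observer"
        self._subscribers.append(callback)

    def publish(self, event):          # GoF: "notify"
        for callback in self._subscribers:
            callback(event)

received = []
feed = Publisher()
feed.subscribe(received.append)
feed.publish("state changed")
print(received)  # prints ['state changed']
```

Same loop, same callbacks, same coupling; only the jargon changed ownership.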
Now I can already hear the people at the back murmuring: 'if there wasn't a market for this stuff, it would rapidly die, right? Market forces and all that. So you must be wrong, Bob'. Push from the authors can't do the job on its own - of course there is pull as well. The pull comes in two basic forms, which I can characterise as (a) people are lazy, and (b) people are paranoid.
The brain is the most resource-hungry organ in the human body. Consequently, human beings have evolved to really hate thinking[2]. So if I can walk up to some manager who is recruiting and say "I am Qwizziwig level 2 certified", he can disengage brain, because I'm obviously twice as good as some other guy who is only level 1 certified. Giving him a number absolves him of the necessity to think. Many managers think this is double-plus-good[3].
Formal methods are sold to managers with two basic promises, which can usefully be classified as follows:
public - Software development speed and quality will improve!
private - You can employ cheap bozos instead of uppity expensive programmers!
Managers of software departments are basically paranoid. That's not a disorder, it's a rational reaction to impossible timescales and massively variable requirements. It's not diseased to be paranoid when they really are out to get you: and the department manager is going to carry the can when the next disaster happens. God knows we can't have directors being to blame, so it must be the middle manager, right? A drowning man will grasp at any straw, so if someone comes along and tells him (publicly) that his timescales will go down, then tells him (privately) in the bar afterwards that he could even consider outsourcing development to Elbonia because the Qwizziwig formal method makes smart engineers not-quite-so-necessary... well, who wouldn't grasp that straw?
I have seen projects using formal methods produce horrendously buggy code, and after more than thirty years and several generations of formal ballyhoo, I have yet to see any project meaningfully benefit from them. My most recent experience was watching an Agile project crash and burn spectacularly from an uncomfortably close distance, all accompanied by the wails of the zealots crying that it must have failed because they were insufficiently zealous. Yes, sure guys, the cure for burns is more application of fire. Here, I'll hold your coats while you play with the matches.
If you really want to make a change that improves code and shortens timescales, then you should just implement proper design reviews. Sadly, they're in the public domain. So there's my ticket on the gravy train gone.
[1] Obviously not the first, because no right-minded engineer ever uses their first attempt at anything.
[2] You can't use yourself as a model here. You're a nerd, right? Nerds like thinking. We also tend to have a lower probability of breeding. We all grasp how evolution works, right?
[3] Literary references on a programming website. What next?