CodeReuse
ThoughtStorms Wiki
Context : OnComposability
What is the greatest challenge of software development today?
Aug 20, 2019 https://www.quora.com/What-is-the-greatest-challenge-of-software-development-today/answer/Phil-Jones-He-Him
There are so many programmers today.
And we are all doing so much work.
And most of that work is "reinventing the wheel", "stepping on each others' toes", "at cross-purposes" etc.
How much Not Invented Here syndrome do we see? How many "me too" reinventions of frameworks and tools, because someone didn't know that a perfectly good solution already existed?
I have so many ideas for things I want to write. We have so much powerful software available. Stacks of libraries upon libraries upon operating system services that already do 99.99% of what I want to do.
And yet when I sit down to write something ... I can imagine the algorithms for the genuinely new part ... and probably sketch them out in an afternoon ... in a few dozen lines of code.
And EVERYTHING ELSE, the "unnecessary complexity", the months of dreary grunt-work, is basically "fitting in" with the existing cruft that has built up. I'm used to a framework that runs fine on Linux but not Windows. Or is great in the browser but not on the desktop. So I have to find a new framework. And if I get that choice wrong, I have the hell of an inappropriate framework. Or I run off and reinvent something I didn't need to.
I have to install the tools to use the new framework. Install the dependencies. Worry about incompatibilities in packages. Incompatibilities with the Java version, with the operating system version and other resources.
Then, once I'm actually in the programming language, I sketch my algorithms out in a sensible way. But when I come to draw a picture on a canvas I have to learn a new API, because the way I learned to draw in another context doesn't carry over. Or I have to learn how to connect to a new kind of database which does the same thing as one I have used before, but with a different vocabulary.
When I want to put my application up as a web service, it doesn't scale and isn't secure. Despite the fact that scaling and security are widely studied and have been implemented properly dozens of times, they still aren't just available to me trivially. I have to do the work myself, because reusing techniques doesn't fit into the format of an include-able library. Or I have to adapt myself to an existing re-usable framework.
Pattern languages exist because patterns can't be reused with an "include" statement, only by programmers reading "how to do this" in a book and then doing it themselves.
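A minimal illustration of that point, sketched in Python: the Observer pattern below cannot be pulled in with an import statement. Every team re-reads the pattern description and re-types some variant of it by hand (the names here are illustrative, not from any real library).

```python
# The Observer pattern: there is no `import observer` that gives you this.
# Each codebase ends up hand-writing a variant like the one below.

class Subject:
    def __init__(self):
        self._observers = []

    def attach(self, callback):
        """Register a callable to be invoked on each event."""
        self._observers.append(callback)

    def notify(self, event):
        """Push an event to every registered observer."""
        for callback in self._observers:
            callback(event)

seen = []
subject = Subject()
subject.attach(seen.append)
subject.notify("saved")
print(seen)  # ['saved']
```

The pattern lives in the programmer's head and in books, not in a reusable artifact; only this particular hand-rolled instance of it is in the code.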
Seriously guys,
the world is full of millions of working programmers today. Spending billions of hours creating code.
And almost all of that code is unnecessary.
But we simply don't know how to co-ordinate with each other to properly reuse code.
This is a problem that goes by different names: "reusability" is one; "composability" is another nice high-level abstraction for the challenge. Why can't we compose our 0.1% of original idea with, and re-use, the 99.9% of work that already exists?
As Richard Kenneth Eng would note, Smalltalk tries to solve this problem by dissolving "monolithic" applications and operating systems into a single large collection of objects: a big box of Lego bricks that can be reused and recombined by each new application. It improves productivity because in the Smalltalk world there is little need for anyone to reinvent existing functionality. It is already there: visible, available, waiting to be reused.
Unfortunately we don't live in Smalltalk world. All our operating systems and cloud providers etc. work on a different model. And there seems no way to bring this Smalltalkness out into the rest of the world.
So at the next level we have the open-source movement, git and public hosting services like GitHub / GitLab etc. And package managers like Debian and npm and Maven etc. All ways to share and reuse code.
They help. But they don't help enough to actually make a dent in the problem. The number of reinventions keeps on proliferating. The cruft builds up.
The Haskellers and advanced type-system people think they have the answer to composability: better type-systems, which let us know exactly when and how different bits can be combined. You could presumably even have search-engines that figure out when two hosted projects could be plugged together, via their type compatibilities. Or that explain exactly what kind of adaptors you'd have to write to make them work together.
But then again, we aren't all using Haskell. And importing our existing code into the Haskell type-system is a big and unlikely task.
Perhaps artificial intelligence can help. Bots can scan all those public repositories of code, building models of the codebases and of the data-structures they create. We can infer Haskellish data-types from the code. And then use those inferred types to help.
But without humans actually working to those data-types, the result is still likely to be a lot messier and more unwieldy than if humans had co-ordinated and co-operated to begin with.
So this is the big challenge. As the population of programmers explodes, and as the amount of existing software grows - both the source code available in code-repositories, and the depth of the stacks we actually use - how do we stop all our gigantic efforts being wasted on reinventing, working against each other, and fighting cruft, technical debt and (in)compatibility issues?