FYI: This is a rant.
This is something I've been thinking about for a few days. Every couple of weeks on the Timcast show, Ian Crossland will go on a tear and insist that sites like Facebook and Google and AWS should be compelled to give up their code.
This has always infuriated me. Not because I have a problem with open source (I happen to like open source), but because, coming from a self-proclaimed "co-founder of minds.com", it strikes me as incredibly lazy, and it reflects how little he actually knows about the web middleware space.
The "secret" of Facebook is 100% network effect. Any half-competent app design team could hammer out Facebook's UI in a quarter, and some have. There is nothing about "the code" that is special. It's pure network effect.
Google... once upon a time involved a bit of secret sauce in indexing, but nowadays they TOO are largely just network effect (this time from the advertisers).
But then there's AWS. That must be secret sauce in the code, right?
Well, no. And really AWS is the most interesting of the three because this is a fight that's been going on for fifty years, namely, mainframes vs boxes. It's a fight that's seen reversals of fortunes and the only certainty is that the current king will always be dethroned.
AWS doesn't do anything "new". Conceptually everything it does in hosting and running code can be traced back to products that IBM and Oracle and Unisys have been selling since the 80's (and in IBM's case, even longer). But AWS managed to strike a nerve because in the 00's the mainframes were getting pretty fucking obtuse about how difficult they were to set up and maintain.
People who think that AWS is somehow an undefeatable singularity are naive. In ten years, AWS will be what WebSphere was ten years ago, and the new hotness will be something that does everything it does but in an on-prem, physical package that can be amortized.
Just stopping to interject my usual comment. Don't say "Cloud", say "Someone else's computer" - it'll help focus your mind on the gaping holes in the cloud model.
Maybe "networked cloud servers" has a better ring to it. Definitely important to point out that there's no ephemeral place in the sky where all the data is flying around; it's either in signal transiting between dishes/cable connections or stored on a physical machine somewhere owned by someone else.
You have much more faith in your transmission media than I do! If I'm sending anything of importance, I'm keeping my copy until the transmission completes and we've done a checksum dance to verify you've got all of the data. Only then does it get deleted.
So, it's either in transmission, stored on someone else's machine OR both =P
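The checksum dance above can be sketched roughly like this (a minimal sketch; the function names and the idea of the remote end reporting its digest back are hypothetical stand-ins for whatever your transfer protocol actually does):

```python
import hashlib
import os

def sha256_of(path):
    """Compute the SHA-256 digest of a file, streaming so big files don't eat RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def delete_local_copy_if_verified(local_path, remote_digest):
    """Delete the local copy only once the remote end reports the same digest.

    Returns True if the digests matched and the file was deleted,
    False if they differ (keep the local copy and retransmit).
    """
    if sha256_of(local_path) == remote_digest:
        os.remove(local_path)
        return True
    return False
```

The point is simply that the local copy survives until the far end has proven it holds identical bytes; so at every instant the data exists in transmission, on someone else's machine, or both.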
To put it very simply...
Any one of these companies (IBM, Microsoft, Unisys, Oracle, VMWare, and the Apache Foundation) COULD have strangled AWS in the cradle if they'd invented Docker in the 00's.
As it was, Docker came into being because people got fed up with waiting for every goddamn company to pull its head out of its ass and offer OS-level virtualization of containerized application stacks.
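To make concrete what "OS-level virtualization of a containerized application stack" buys you: with Docker, packaging an app together with its entire runtime is a handful of declarative lines. A hypothetical example (the app, file names, and port are made up):

```dockerfile
# Hypothetical Python web app shipped with its whole runtime
FROM python:3.12-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 8080
CMD ["python", "app.py"]
```

Build it with `docker build -t myapp .` and run it anywhere with `docker run -p 8080:8080 myapp` -- no WebSphere-style install-and-configure ceremony, which is exactly the nerve AWS and Docker struck.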
At the time I was an IBM admin, so I know they had the pieces in their hands to do it if they'd wanted to, but they were too narrowly focused on how to profit from the individual products to zoom out and figure out how to sell them all as an integrated, easy to use platform.
Now they have to play catch up. And as products like Red Hat's OpenShift show (poorly), they're going to have a lot of work to do to make it user friendly. But they will catch up eventually.
They are not wrong though. Fewer features means less complexity, which means fewer bugs and tests. Your point is valid too though.
I blame the dumbing down of application UIs first on garbage programming skills, then on the demand that everything be browser-based now. Mainly on programming skills though. The modern programmers I've talked to are totally clueless about how things happen within the computer and just know how to stitch together a bunch of frameworks someone else wrote into something that works some of the time. I'm not saying don't use other frameworks, but they don't understand them and couldn't write their own code for those things if they had to. It's kind of sad, because for the most part I've never been more than an amateur programmer and I understand it so much better than these supposed professionals. When my dad was doing software development 30 years ago, it was totally different--those people knew their shit.
Going browser-based was brilliant though. A web application is about as simple a software delivery mechanism as you can think of. Every platform has a browser, and a web app is delivered just by navigating to a URL. You avoid all the problems involved in installing and updating traditional software. And on top of that you get better standards for security.
It’s true that many developers are incompetent now though.
This is what Sun was thinking Java could become all the way back when they introduced applets in the 90's. They wanted to move to a world where software isn't installed, it's just delivered and executed.
Tragically, applets were comically bad. They had all the problems inherent to Flash, but on top of it they were slower too.
Yeah exactly, the browser platform is what Java was hoping to create, but they failed because their browser plugin always had horrifically bad security problems. Few could foresee JavaScript evolving in speed and power to the point that it could replace it.
No, cloud and SaaS are focused on the application developers more than the users. It's just common sense that if you can avoid having to maintain your infrastructure on top of writing your software, you have more time and resources for what you actually care about.
Unless of course Amazon pulls the plug on you…
The UI thing is a separate development where designers are just trying to make things simpler for users.
For things that need to be internet-based and distributed, like websites and the like, the cloud stuff makes sense. Still, at that point it's not a lot more than shared scalable hosting. It's one of those things where I can turn up and down cloud things in a very small amount of time and cost versus having to procure and maintain hardware. Then I suppose there's also things like software defined networking, which makes some sense--although that's more cloud tech that will be used by providers and not just offloading crap to AWS.
I don't partake in whatever podcasts or videos or whatever, but the notion that cloud tech is some sort of secret that they are hiding is kinda asinine. If anything those companies are the big players because they are the most invested in data center space and should have the ability to be more cost competitive because they need a lot of the high dollar things like hardware and connectivity anyway for their own business uses. Anyone with the capital could turn up, and does turn up, competition to AWS. You just need some appropriate servers and some ISP circuits.
My personal "cloud isn't the future" rant though is the centralization of processing that is totally unnecessary. Things like all these "smart devices" that won't operate on their own despite the CPU functionality they need to do their job being at the level of a bottom of the barrel ARM CPU. It's one of the things you're starting to see abandoned too--like the Logitech Harmony product line. Cool stuff, but there's zero reason for a fancy infrared transmitter to need a cloud to work.
Why people involved him in the creation of Minds I have no idea, since he has a very low-resolution understanding of why Facebook is powerful: it's not the code, it's their network/userbase. We could all set up a Facebook clone tomorrow and, you're totally right, it would serve no purpose.
As for the cloud, it's literally both the past and the future. Whether computing is done on-premise or on someone else's supercomputer is mostly a matter of economies of scale. Back in the day you couldn't get a good processor, so you rented the use of one remotely. Then Windows popularized cheap processors, and the model moved to selling the OS directly instead of time-shares for access to it. Whether we do cloud computing or on-premise computing will swing back and forth depending on the availability of hardware to the consumer.
That said, fuck the cloud; my intent is not to rent. I want to own what I value.
And the quality of the software, which was the problem in the 00's.
A LOT of innovation was happening between 2005 and 2010 with Hadoop and then Docker, as well as in the management space with stuff like Puppet/Salt/Chef/Ansible. People had finally started to really buy into the idea of scaling by virtualization, but the big players were not offering any answers on how to make it actually work end to end.
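For flavor, this is the kind of thing those management tools made trivial: declare the state you want and let the tool converge every machine to it. A hypothetical Ansible playbook sketch (the host group and package are made up):

```yaml
# Hypothetical playbook: converge every web host to the same state
- hosts: webservers
  become: true
  tasks:
    - name: Install nginx
      apt:
        name: nginx
        state: present
    - name: Ensure nginx is running and starts on boot
      service:
        name: nginx
        state: started
        enabled: true
```

Run it against a hundred boxes or one; that "describe it once, apply it everywhere" model was exactly the end-to-end answer the big vendors weren't offering.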
I try not to but people keep sending me jira tickets.
I have two hours of thirty minute standups every day due to people trying to force agile on infrastructure and operations.