
The Software Industry Grew Up While We Weren't Looking

This week I'll continue expanding on things I brought up in my first post by taking a deeper look at how the software industry has matured over the last decade or so. I noted that the software industry has come of age, in a way. When I say that, I'm actually thinking about it in a few different ways.

In some of the software blogs I've read, especially posts from the 2003-2005 time frame, I got the distinct impression that many software developers believed the practice of software design and development was immature. Developers didn't want to be called engineers because they didn't see software development as anything like other forms of engineering, because they didn't think developers were professional in the same way other engineers are, or because they didn't think the tools, processes, and best practices were nearly as developed as those in other engineering fields.

This is by no means a complete list. I've seen countless other reasons for this apologetic or dismissive attitude, but let's focus on these three views and not overwhelm ourselves. I would say these views have either diminished in prevalence or improved significantly, and in doing so, the software industry has been growing up. Let's take each of them in turn, shall we?

So is software engineering like other forms of engineering? Of course it's not. Is nuclear engineering like mechanical engineering? Not really. Is aerospace engineering like chemical engineering? Um... nope. Is electrical engineering like civil engineering? You get the idea. These are all different engineering disciplines because they are different. They do have one thing in common, though - the definition of engineering:
Engineering is the application of scientific, economic, social, and practical knowledge, in order to design, build, and maintain structures, machines, devices, systems, materials and processes.
In the case of software engineering, the pertinent parts of the definition would mostly be social and practical knowledge applied to systems and processes, but it definitely applies.

It seems that what had software developers rejecting the engineering label was partly a feeling of awe at creating their own rules. How could software design be engineering when you can just define the rules to be a certain way and then design to those rules? If you don't like the rules, or they don't work, you can change them. Other fields of engineering are limited by physics. How unfortunate, and not at all like designing software, went the line of thought. But software is limited by physics - specifically time and space, in the form of execution speed and memory or hard disk space. Those are hard (although improving) constraints that affect software scalability all the time.

Beyond the feeling that physics did not apply, there was a sense that each piece of software was something completely new and different, which made it seem like domain knowledge gained on one project was not transferable to the next. That characteristic of software design seemed to set it apart from engineering. So let's take building bridges as a counter-example. If a civil engineer is always building bridges over roads, the skills she has developed are probably pretty applicable from one project to the next. If she goes from bridges over roads to bridges over rivers or canyons or even freeway interchanges, the constraints change dramatically, and so does her ability to apply what she's already learned. What happens if she moves on to designing hydroelectric dams? That's a completely different kind of design problem. Will she need to learn new things? Of course! It's not so different from moving to a new domain in software development.

Why did software developers think their profession was somehow different at a fundamental level? I'm not sure. It might have had something to do with the perspective of "What you do is easy, what I do is hard." I think the software industry has outgrown that way of thinking. Developers have realized how much software design is like engineering and accepted the engineering label because that is, in fact, what they do. They've moved on to focus on solving the pressing problems before them.

That brings us to the second reason software developers didn't want to be called engineers: they didn't feel like professional engineers. As for the formal title of Professional Engineer, most engineers don't hold it. Let's get that out of the way right away. That title mostly applies to civil engineers and architects. Other engineers can take the test and get the certificate to make it official, but in general it's not required.

It almost seems like software developers didn't want to be called engineers because they wanted to avoid a perceived sense of responsibility or because of a fear that becoming engineers would take the fun out of software development. The rise of large software companies like Google, Amazon, and Facebook and the successes of the Open Source movement have gone a long way toward changing that perspective. Now you can design great software professionally while still having fun doing it, and it's all quite respectable. On top of that, these companies commonly call their software positions "software development engineer," or something similar. They've promoted the label, and it's become widely accepted in the industry.

That still leaves us with the issue of software engineering tools, processes, and best practices. The improvements gained in this area over the past decade have been extraordinary. We now have well-defined processes that range from the more structured and controlled Rational Unified Process to the more flexible Agile Methods. Depending on project and customer requirements, the methodology can be made to order from a wide array of standard options. On top of that, best practices are available at your fingertips, either in books too numerous to list or on excellent websites such as stackoverflow.com.

And then there are the tools. The tools are where the maturation of software development really shines. Integrated development environments and debuggers have come incredibly far in a relatively short time frame, and we now have a plethora of options to choose from. Bug tracking and version control software has transformed the way teams plan and build software, making large projects so much more tractable and manageable than they were before, while supercharging developer productivity. Now distributed version control promises to drastically improve software code base management yet again.

These tools show the maturity of the software industry not only in their application to software engineering, but also in the fact that other engineering fields are adopting them to improve their development. While I was an electrical engineer at my previous company, we started using Bugzilla and Subversion on our projects, resulting in significant improvements in project management and quality. It turns out that bug tracking and version control have general applicability to basically any business or field of research and development. Tracking issues and change management are fundamental to achieving quality, and these tools that automate and streamline those processes were developed first for software engineering. Now they are spreading into all kinds of applications from medicine to supply chain management to customer service, and making substantial improvements wherever they are applied.

While thinking about other engineering fields accepting development methods from the software industry, it occurred to me that this was a defining characteristic of a maturing industry. The software industry was no longer just taking from and building on the other fields of engineering. Now other fields of engineering were building on the innovations coming out of software engineering. Those decade-old views on software development not being a form of engineering mostly disappeared because the software industry had grown up without us (well, me at least) noticing. Even though the software industry has matured in the last decade, it shows no signs of stagnating. New trails will continue to be blazed. Novel discoveries will continue to be made. The innovation we have seen has only just begun.

The Incredible Disappearing Hardware

In my first post I alluded to a number of issues but didn't go into detail on any of them. Over the next few posts, I'll expand on some of those issues. Today I'd like to look into my assertion that hardware is turning into software and explore the hardware/software dichotomy that exists in essentially every product we buy that has an electric pulse.

I'll start by making an equally surprising claim: hardware is becoming invisible. I know I'm not the first to say this. I have the distinct memory of hearing someone say it and thinking, "that's exactly what's happening," but unfortunately I can't remember where or when I heard it, or who said it. As it stands, I've been noticing that hardware is disappearing for a few years now, and every major new product announcement has only further convinced me that it's true.

So what does that even mean, hardware is becoming invisible? Of course it's not, you say. There's more electronics hardware than there's ever been. But in a physical sense, hardware has been shrinking in size. It's obvious that our phones, computers, and televisions are getting smaller, lighter, and thinner. Look at all the svelte new tablets and super-slim LED TVs compared to their predecessors from 5 years ago, or even 2 years ago. That trend has been going on for decades, though. Remember the bricks they passed off as cell phones in the '80s?

That's not what I mean, though. How about completely new types of products? A new product can't come out in a clunky, bulky initial version, or it will be summarily passed over. It would have to be something incredibly novel, or it would be a joke, and rightly so. Beyond that, you would have to think of something that a smart phone or tablet can't already do with an app, and if there isn't an app, you can bet there will be soon. I have an app that can start my car's climate control to the temperature I want. No electric starter necessary. The car comes with that capability built in. I just had to download the app on my iPod, and it talks to the car through the internets! That's just the tip of the iceberg for the iPod. There's a practically infinite number of things it can be, including a phone if I Skype on a WiFi network. Oh, and it's a portable music library, too. I almost forgot about that. How many CDs does it take to hold 10,000 songs? How much hardware has this one little device made obsolete or irrelevant?

Okay, okay, let's push a little harder on this iPod idea, and let's throw in every other display device while we're at it, because this is in no way Apple-specific. That means computers, laptops, ultrabooks, smart phones, tablets, and smart TVs (or any TV connected to a PS3/Xbox 360/Blu-ray player) are all part of this discussion. They are all becoming invisible. They are all turning into software. They are all just a display attached to the internet. Nobody cares what processor is in them as long as it's a clear, vibrant display that can show people what they want to see. The device doesn't even need to do what the user wants it to do directly, because it can just be done in the cloud... with infinite upgrades to the software.

Increasingly, what you want to see and do with software can be done anywhere. You can be hardware, and even operating system, agnostic. All you need is an internet connection, and you can get at your data and media from any device running any OS you please. You can even pick your cloud or use a variety. There's Dropbox, Google Drive, iCloud, and SkyDrive, just off the top of my head. They give you access, backup, and recovery of your data anywhere, all the time. Who needs physical backup media anymore? That hardware is disappearing.

I'm actually finding myself using my desktop computer less and less. The only thing I do on it anymore is edit photos and video because it's a focused activity where I don't mind sitting in front of a computer terminal for a while. Anything else I'd rather do on a different display in a different setting. There's no single device that's replacing the desktop computer, though. I play games and watch movies on my TV and PS3, I listen to music and do all kinds of little tasks on my iPod, I read books on my Kindle, I code (and now blog) on my laptop, and I want a Kindle Fire for general internet reading because I'm tired of using my laptop for that. The hardware doesn't matter; it's all software with a display that allows me to interact with it.

Here's another take on how hardware is becoming invisible. Remember that whole Intel-AMD battle being fought over who could build the fastest, most powerful processor? Who won? How about Google. Or maybe Facebook. No, really, does anyone even still care? Any of a number of processors could be in your devices, but in all likelihood, it doesn't matter to you. Somewhere during the race to maintain Moore's law, software decided, "This is good enough, thanks." Even for the more demanding applications, a 3 GHz dual- or quad-core processor and 4GB of memory seems to be plenty.

The processor performance race and the memory upgrade track have both kind of fizzled. Intel came out with a 3.8 GHz Pentium 4 in 2004. The architecture sucked, but let's ignore that. What's the fastest Intel processor you can get today? A 3.6 GHz Core i7-3820, and it's 9 years later. That should be 6 doublings, or 64 times the transistor count since the Pentium 4. I know the frequency couldn't keep increasing at that rate, but the frequency didn't increase at all. If it had, the Core i7 would probably be hotter than the surface of the sun. Where did all of those transistors go?! Not into extra cores because it's only a quad-core processor - that's four cores, not sixty-four. They mostly went into on-chip cache, integrated on-chip graphics, and extra features for software like hardware virtualization (think about that for a second). You might make the argument that most software can't take advantage of 64 cores, and that's probably right, at the moment. But I'm saying that software doesn't even need to take advantage of more cores. Most software doesn't need faster hardware, so hardware is becoming irrelevant because anything available is good enough for all practical purposes.
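
For anyone who wants to check that 64x figure, here's the back-of-envelope math as a quick Python sketch. The 18-month doubling period, the 9-year span, and the three memory doublings all come straight from this post; the rest is just arithmetic:

    # One doubling every 18 months, per the usual Moore's law rule of thumb.
    def projected_growth(years, doubling_period=1.5):
        return 2 ** (years / doubling_period)

    print(projected_growth(9))    # 2004 to 9 years later: 2**6 = 64x the transistors
    print(projected_growth(4.5))  # three doublings: the 8x I use for memory below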

Memory is on pretty much the same stalled path. In 2008 a low-end system had about 4GB of RAM and you could load up a high-end system with 16GB of RAM. I just looked at Newegg.com, and memory in pre-configured desktops and laptops ranges from, wait for it... 1GB to 16GB. Hmm, just out of curiosity, I checked my order history, and in 2007 I bought 2x2GB of DDR2-800 SDRAM for $100. In 2010 I bought 2x2GB of DDR3-1600 SDRAM for $110. That would be the same amount of memory, albeit faster, for slightly more money. Today you can get exactly the same memory for $30, and it's still considered a decent amount of memory. However, if memory had actually continued doubling every 18 months, I should be able to get 8 times 16GB, or 128GB of memory today. Well, I could get 64GB, but it would have to be in the form of 8 DIMMs, and not many motherboards support that. Besides, what counts is the amount of memory jammed onto a single DIMM, and that certainly hasn't doubled three times since 2008.

I would say memory hasn't continued to grow exponentially because it no longer needs to. There are very few consumer applications that use more than a gigabyte, and the people who do use those memory-hungry applications aren't going to see significant performance improvements beyond 16GB. Software has stopped expanding in size to fill any memory available. Instead, software is expanding in kind to fill any desire available. The hardware is no longer the driver for solving new and interesting problems; you don't have to wait for hardware to advance enough to solve hard problems. Now the driver is the software, and if you can come up with better software to solve people's problems more easily or elegantly, you don't have to wait. The hardware is already there, it's already capable, and it's already virtualized and abstracted so you can ignore most, if not all, of the hardware details.

Let's look at another area where software is replacing hardware. Engineers have made tremendous advances in DC/DC converters lately, and those advances are enabling incredible efficiency gains in DC motors and LED lighting. Inside these new converter chips is a software control loop that reacts intelligently to the available input power and desired output power to extract maximum efficiency from the power source, usually a battery. The software is making these advances possible, and they are coming at a faster rate every year as people discover novel improvements to the software algorithms that control the converters.
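
To give a flavor of what one of those control loops is doing, here's a toy sketch in Python of the basic perturb-and-observe idea: nudge the converter's operating point, check whether efficiency got better or worse, and keep or reverse the nudge. This is only a conceptual illustration, not how any particular converter chip actually works, and the three functions passed in are hypothetical stand-ins for the real hardware interfaces:

    # Toy perturb-and-observe efficiency loop, sketching the idea behind a
    # software-controlled DC/DC converter. The callables passed in stand in
    # for real hardware measurement and control interfaces.
    def efficiency_loop(read_input_power, read_output_power, set_duty_cycle):
        duty, step = 0.50, 0.01        # start at a 50% duty cycle, nudge by 1%
        last_efficiency = 0.0
        while True:
            set_duty_cycle(duty)
            efficiency = read_output_power() / read_input_power()
            if efficiency < last_efficiency:
                step = -step           # the last nudge hurt, so reverse direction
            last_efficiency = efficiency
            duty = min(0.95, max(0.05, duty + step))   # nudge and clamp the duty cycle

Real converter firmware is far more sophisticated and runs much faster than anything like this, but the feedback idea is the same: the intelligence that squeezes out the extra efficiency lives in software, not in the power stage itself.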

Now imagine what happens when all of those advances in DC motors and LED lighting get combined in a car. Well, you can take out the engine, the fuel line, the gas tank, the exhaust system, the engine coolant system, and the oil. Put in a big DC motor and a ginormous battery, and you've got a Nissan LEAF... or a Tesla Roadster (sweet, but expensive). More electric car models are coming this year, and they are going to keep on coming because they are the future. These cars would not even be possible if it weren't for all the software that's packed into them to control the efficiency of the motor and lights, and especially to manage the charging, discharging, and regenerative braking for the battery.

I have a LEAF, and I am constantly impressed with the software in it. Remember that it can be climate-controlled from my iPod? I can also check the charging status over the internet, download new charging station locations that are mapped in the GPS navigation system, and get updates on all kinds of environmental information. But the really thought-provoking feature combination is that the car is internet connected and the battery/motor performance is software controlled. Think about that for a moment. I'm not sure it will happen, but if Nissan released an update, I could download a performance upgrade for my car. How's that for virtualizing hardware and making it more flexible through software? Hey, I can dream, can't I? It's now theoretically possible.

There are many more examples of how hardware is turning into software, but that's plenty to show where the idea came from and where it's headed. The implications for the future are truly mind-boggling. What hardware do you think will turn into software next?

Setting Some Ground Rules (For Myself)

I'm Sam Koblenski, and this year I've decided to write a blog. I've been kicking around this idea for some months now, and I figured the beginning of a new year would be as good a time as any to get started.

Why am I writing a blog? Well, last year I changed jobs for the first time in my career, and in so doing I learned a lot about what I haven't been doing to develop professionally. I had learned many new skills and advanced quite well in IC design and test, but at a fundamental level, I was becoming complacent in my job and letting my professional development languish.

When I moved to Hi-Techniques, I made a conscious decision to step out of hardware design and into software design, as I believe that more and more, software is where the edge of innovation is happening. Hardware is turning into software, so to speak, and software is growing and morphing at an increasingly rapid pace.

So I took a big step into software engineering. That is almost exclusively what I am doing at work now, but that is not all I am doing. If it were, it wouldn't be much in the way of professional development; it would just be learning new skills in another job setting. No, what I've really been doing over the past year to improve as a software engineer is reading an incredible amount of material about the world of software design and development, both offline, through books, and online, mostly through blogs.

What I've discovered is an incredibly rich and diverse software culture that has seemingly sprung up out of nowhere since I was studying electrical and computer engineering and computer science in college. That was in the first part of the last decade, and yes, I know the software culture didn't start then. It's much older than that, but I was in college...in a fraternity...and nobody told me that big things were happening outside my little world. 

I knew technology was advancing at an ever faster pace, and I chose the degrees I did because I was fascinated by technology and wanted to engineer my own small piece of it. It never occurred to me that there was so much more to design than learning the principles and concepts from college courses. Working in a niche hardware market designing sensor interface chips didn't really encourage me to look around much, either. 

Well, I was young and dumb. Now I know and I'm working on it. But at the same time, as I read those blogs, I can't help but think that the software industry has been coming of age since the dot-com crash. If you weren't paying attention, as I wasn't, it was probably easy to miss. I have a lot of catching up to do.

So why am I writing a blog?!

1. I've Always Hated Writing

Yes, that's right. Ever since high school...middle school, really...I just couldn't stand writing my thoughts down. I stopped taking notes and I always saved writing assignments for last. I could always write a decent paper, and I never had much trouble completing one once I got started. It was just so incredibly boring! Now I know why. It was hardly ever an option to write about anything that interested me, the style was normally dictated by the teacher, and I was never moved to write. It was either assigned to me, or expected of me, and I never looked at my notes again anyway, so why write them?

Suddenly, I find myself quite driven to write. I'm reading a lot of great writing by other programmers, and it is making me want to improve my own writing. The only way to do that is to write. The best way to do it is to write publicly. So here we are.

2. I've Got Ideas Coming Out of My Ears

And they're making a mess on the floor. All of this reading and studying is generating so many ideas, and they are becoming a massive jumbled mesh in my head. I need a way to organize them and understand my thoughts better, and I've read quite a few blogs that recommend writing as a great way to do it. Making it public is even better since it's much more difficult to be lazy about completing thoughts and drawing conclusions when other people are reading what you write.

3. I Might Have Something to Contribute

At this point, I feel like I probably don't have anything to say that hasn't already been said. Even that sounds eerily familiar, like I've read it before. But maybe some of you won't have heard some of these things before. And just maybe, I might say something new or connect ideas that lead to novel insights. You never know, unless you try.

Now that that's settled, what exactly am I going to write about? The major focus will be grappling with my thoughts on software design and development, where the industry is headed, and how to improve as a software engineer. I'll also occasionally digress into economics, education, energy, and the environment mostly because I am intensely interested in these topics, but also because there are so many dependencies between them and the software industry. Of course, I reserve the right to talk about anything that strikes me since, you know, it's my blog.

Finally, we come to the question of how often you can expect these ravings from me. Writing a blog is like anything else that requires practice to get better. Discipline is important, otherwise life has a way of creeping in and stealing your time for other things. Besides, an untended blog isn't going to benefit anyone. I'm going to commit to posting once a week. I don't think I can do more because I have a wonderful wife and two young kids that take priority. I hope anyone reading this blog will hold me to that schedule, and I definitely welcome comments. I'll read every one and respond when I can. I'm excited to see what will become of this venture, and what we'll learn along the way.