By: Phillip Sharpless

.NET & Business Intelligence Consultant @ Key2 Consulting

 

As an application developer, it’s been interesting to reflect on how development methodologies have adapted as technology rapidly evolves. The central focus of the app dev world has shifted several times over in just the 15 years I’ve been a professional developer. One major direction much of app development is now heading toward is the cloud. In the cloud-centric application model, both the application’s processing and its data reside in the cloud, with effectively all the resources an app could ever ask for. End users need not install these applications or upgrade them, or give them much thought at all outside of actually using them. A cloud application’s services are accessed via a device running a thin or thick client, most often a web browser or a native mobile app.

 


Interestingly enough, this future shares a lot of similarities with the past.

The computing of the 1960s and ’70s was largely dominated by the mainframe. Computing resources (processing power, memory, storage…) were both very costly and physically large. Typically, only the largest corporations, universities, or government agencies could afford to own and operate one. The need to grant broader access to this limited resource resulted in a model that sounds remarkably similar to the one just described.

The mainframe served as the central computing platform where the applications actually ran and where any data processing or storage occurred. The mainframe itself was set up and maintained by specialists. Access to its computing power was provided through terminals – possibly just a keyboard and monitor – that had some means of communicating with the mainframe. Requests or jobs were submitted to the centralized computer, the actual work took place there, and the output was sent back to the terminal. Applications were built with the idea in mind that they would be hosted, almost as a service, for multiple consumers. This model indeed bears some resemblance to the modern cloud, even if only superficially.

 


The dawn of the PC era throughout the ’80s shifted focus away from this centralized computing model. It was now possible to build a computer small and affordable enough that people could actually have one in their own homes. These machines were not simple terminals but full computers in their own right, with their own processor, memory, and storage immediately on hand. This freed them from having to be tied to a central mainframe for work to be performed.

Application development quickly shifted to support this new landscape. Apps were now often developed to be installed and run locally on the physical device the user was interacting with. Given the lack of widespread connectivity, this was the only real model that could make computing ubiquitous. These applications were, of course, bound in complexity and performance by the hardware resources available on the device running them. As such, PCs had to constantly get bigger and beefier. Faster processors, more memory, larger hard disks – the specs of the machines had to grow so the software running on them could grow too.

 


I started my professional career doing almost exclusively desktop application development on the Windows platform, beginning with MFC C++ in the late ’90s and moving on to .NET WinForms and WPF with C# throughout the 2000s. Since it was the first way I learned to think about application development, it was the only way I thought of it initially, and it took me some time to adjust philosophically! While the web boom had hit in the mid ’90s, it wasn’t until the Web 2.0 era of the mid-2000s that web application development began prompting a noticeable pivot back to the idea of hosted applications entirely out of the end user’s hands. For nearly the past decade I have developed in the web app world, mostly in ASP.NET WebForms and MVC, as well as pure HTML/JavaScript apps.

After decades of desktop application development, many of the headaches developers had grown accustomed to dealing with seemed to go away in this new web-centric model. No more having to support legacy versions of the application. No more struggling with issues caused by resources that may or may not be present on the user’s hardware (DLL hell, minimum/recommended specs, etc.). No more maintaining several different code bases just to keep the application available cross-platform.

Of course, this model raised some new challenges as well, particularly around security. Security concerns became paramount in a setup built around constant communication over a network. Vulnerabilities or exploits can potentially bring down the application for the entire user base, or worse, allow data to be corrupted or stolen – something that has indeed happened to numerous organizations over the years.

Another new consideration was managing the resources necessary to keep the hosted application up and running. Err on the side of too few resources and you’ll be faced with sluggishness or outages; err on the side of too many and you’re simply paying for horsepower you don’t need. Many fundamentally non-IT organizations have been forced into setting up and maintaining sophisticated IT infrastructure, typically a costly endeavor.

Many organizations are now looking to the cloud to address these concerns as network capacity and virtualization technology continue to improve. The promise of all the advantages of hosted applications, without the disadvantages of setting up and maintaining the infrastructure needed to support them, makes the transition highly appealing.

 

The hope is that organizations can once again focus on their core business, leaving the IT infrastructure to the experts, while potentially saving costs by utilizing only the resources actually needed and improving overall reliability. As application developers, we must once again readjust how we approach software development when building cloud-centric applications. Major development considerations with the cloud in mind include architecting the software to take advantage of scalability, ensuring it is not tightly coupled to specific data sources, file systems, or implementations of particular technologies, and proactively considering the security of every interaction between components.
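To make that decoupling point a little more concrete, here is a minimal C# sketch, assuming a modern .NET environment; the names (IFileStore, LocalFileStore, ReportService) are hypothetical and not taken from any particular project. The idea is that application code depends only on an abstraction, so a local-disk implementation can later be swapped for a cloud blob store without touching the consuming code.

// A minimal sketch of decoupling application code from a specific file system
// or data source. The consumer depends only on an abstraction; the concrete
// store can be swapped (local disk today, a cloud blob store tomorrow).

using System.IO;
using System.Threading.Tasks;

public interface IFileStore
{
    Task SaveAsync(string key, byte[] content);
    Task<byte[]> ReadAsync(string key);
}

// Local-disk implementation, suitable for development or on-premises hosting.
public class LocalFileStore : IFileStore
{
    private readonly string _rootPath;

    public LocalFileStore(string rootPath) => _rootPath = rootPath;

    public Task SaveAsync(string key, byte[] content) =>
        File.WriteAllBytesAsync(Path.Combine(_rootPath, key), content);

    public Task<byte[]> ReadAsync(string key) =>
        File.ReadAllBytesAsync(Path.Combine(_rootPath, key));
}

// A cloud-backed implementation (for example, one wrapping a blob storage SDK)
// would implement the same interface; the consumer below never needs to change.
public class ReportService
{
    private readonly IFileStore _store;

    public ReportService(IFileStore store) => _store = store;

    public Task PublishAsync(string name, byte[] report) =>
        _store.SaveAsync(name, report);
}

With this shape, choosing the appropriate implementation at startup (for example, through dependency injection) becomes the only place where the hosting environment matters.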

The saying “there’s no constant in life except change” has proven very true with regard to application development. As technology has evolved, application development paradigms have adapted to fit whatever the current landscape is. Even as cloud computing promises to be the next big evolution, many of the conceptual aspects surrounding it are reminiscent of the past. Many considerations from the desktop app era persist as well, since client applications can still be quite sophisticated in their own right. It would seem that as app developers our mindsets should be just as adaptive as the technology has been, embracing the future while understanding the past.