7 Technological trends on the Mirabeau Summer 2015 TechRadar

Author
Heini Withagen
Date

Mirabeau loves technology. We eat, sleep and breathe it. And twice a year, our sharpest minds come together to discuss developments in technology and tooling. Those sessions culminate in an up-to-date TechRadar, in which we determine which technologies we’re going to employ, which need further investigation, and which are outdated.

What’s on Mirabeau’s Summer 2015 TechRadar? We’ve assembled seven of the hottest topics, just for you:

  1. Front-end: Angular.js is a powerful framework for developing complex web applications.
  2. Quick releases: Continuous Delivery ensures automated, timely software delivery.
  3. Multi-channel development: Native development per platform is necessary for the best user experience.
  4. Internet of Things (IoT): Will TCP/IP significantly accelerate the development of new applications in the IoT field?
  5. Building program structure: Functional Programming is becoming more mature, emphasizing the need for polyglot programmers.
  6. Microservices and Containerization: Building apps a better way, one small service at a time.
  7. Data Scientists: The idols of the future.

Front-end: Angular.js is a powerful framework for developing complex web applications

Front-end technologies, and the tooling around them, are rapidly maturing. Front-end engineers have learnt a lot from regular software engineering, and standard software patterns are now being applied to the development of complex browser-based applications. Make sure team members working on the front- and back-end collaborate closely; team velocity will greatly benefit from their combined efforts.

While jQuery is still used on countless platforms, we see a reduced necessity for it. In older browser versions, an abstraction layer helped a lot to get functionality working correctly across multiple browsers. In the latest versions of many browsers, however, more and more features are supported natively. Therefore, jQuery is on hold unless there is a very explicit reason to use it.

Angular.js is getting a lot of traction. Partly hype, partly because it is genuinely powerful. We see great use for Angular in the development of complex web applications (e.g. personal banking, complex multi-step searches for airline tickets) but less for regular web environments with lots of content. Using Angular brings the extra challenge of keeping your site indexable by search engines such as Google. Use Angular with caution and for the right type of project.

JavaScript tools with names ending in .io are popping up everywhere. Lots of people are putting energy into this field, and it’s worth watching for new tooling.

Quick releases: Continuous Delivery ensures automated, timely software delivery.

Continuous Delivery (CD) and Continuous Integration (CI) are closely related. While CI focuses on reliably developing functionality within a team, CD adds the capability to swiftly release any new feature to a live environment. This is the Holy Grail for many development teams and business owners. More and more tooling is becoming available to support Continuous Delivery. In summary, the key point is: automate everything. Automating the process gives individual team members, as well as the team as a whole, the confidence that any additions or changes to the platform will function correctly and that existing functionality will keep working as before.

Programmatically building up environments through cloud providers greatly enhances Continuous Delivery. Environments can be easily and quickly deployed and deleted (to save costs). As a consequence, system administrators will need coding skills, and system configurations can now be versioned like regular code.
The concept of Immutable Infrastructure, and a release process that makes use of it, changes the way deployments have been done for many years. A new release is equivalent to deploying a completely new environment (including infrastructure) in parallel to the old one, through automated scripts, and gracefully switching users to the new environment. This is also a great setup for A/B testing.
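The release flow described above can be sketched as a toy simulation: build a fresh environment, smoke-test it, switch traffic over, and discard the old one. `Environment`, `Router` and `release` are illustrative names only, not real deployment tooling.

```python
# Toy simulation of an immutable-infrastructure release: every deploy
# creates a brand-new environment; a router switch moves traffic over.

class Environment:
    """Stands in for a fully provisioned, immutable environment."""
    def __init__(self, version):
        self.version = version

    def handle(self, request):
        # A real environment would serve the application here.
        return f"v{self.version}: {request}"

class Router:
    """Points all traffic at exactly one environment at a time."""
    def __init__(self, env):
        self.active = env

    def route(self, request):
        return self.active.handle(request)

    def switch_to(self, new_env):
        old, self.active = self.active, new_env
        return old  # the old environment can now be deleted to save costs

def release(router, new_version, smoke_test):
    """Deploy a new environment in parallel, verify it, then switch."""
    candidate = Environment(new_version)
    if not smoke_test(candidate):
        raise RuntimeError("smoke test failed; old environment stays live")
    return router.switch_to(candidate)

router = Router(Environment("1.0"))
print(router.route("GET /"))  # served by the old environment
release(router, "2.0", lambda env: env.handle("ping").startswith("v2.0"))
print(router.route("GET /"))  # served by the new environment
```

Because the switch is a single atomic step, a failed smoke test simply leaves the old environment live, and routing a share of users to the candidate instead of all of them gives you A/B testing for free.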

We see a lot of innovation in architectures that support Continuous Delivery, and we expect this to continue with concepts like Immutable Infrastructure, software-defined architectures and containerization techniques like Docker.

Multi-channel development: Native development per platform is necessary for the best user experience.

With mobile accounting for an ever-increasing share of their user base, a mobile-first approach is now common sense for most businesses. However, with the availability and adoption of other technologies around mobile (e.g. iBeacons), eBusiness teams increasingly realize that a multi-channel or multi-touchpoint approach is the real goal they should be aiming for.

Alongside an iOS update, Apple introduced its own programming language, Swift. Swift will increase the productivity of mobile developers, but it also shows that introducing a new programming language is not something to take on lightly. Apple has surprised many developers with non-backward-compatible changes in the first minor versions of Swift. Be aware that future updates to Swift might force extra releases of iOS apps.

In recent years, several toolsets (like Kony, Xamarin, etc.) have tried to make life easier for mobile developers building apps that run on iOS, Android and Windows Phone, and they have succeeded to some extent. Our conclusion: for the ultimate user experience, native development per platform is an absolute necessity. We believe this will be needed for most high-end B2C applications. For many B2B and B2E (business-to-employee) applications, tools like Xamarin can boost the productivity of mobile teams.

With the introduction of the Apple Watch, the hype around wearables has reached its peak. Over the next year or so, we will see whether they can live up to their promise from a user perspective. From a technology perspective, screen resolution, touch resolution and battery life have proven acceptable for daily usage.

Internet of Things (IoT): Will TCP/IP significantly accelerate the development of new applications in the IoT field?

A few years ago, the first refrigerator with Internet connectivity was a novelty. Now, new devices with all sorts of interconnectivity are announced on a daily basis. However, a standard for interconnecting vast numbers of devices (several hundred in, say, a home or office) has not yet emerged. Zigbee and Z-Wave are popular in home automation, and a large array of devices (switches, sensors, etc.) is available from different vendors. However, their communication protocols are mostly proprietary and require deep technical skills to operate and develop applications for.

WiFi and TCP/IP are gaining popularity in this field because knowledge of these protocols is widespread. But they come with a huge drawback: they require relatively large amounts of power to operate, which poses a challenge especially for battery-operated sensors. Still, we believe TCP/IP has a good chance of becoming the default protocol for the Internet of Things (IoT). That would greatly accelerate the development of new applications in this field, as a large number of developers with ready-to-use knowledge is available.
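The "ready-to-use knowledge" argument is easy to illustrate: with plain TCP/IP, a sensor that reports readings is only a few lines of everyday socket code. The sketch below runs entirely on localhost; the JSON-lines message format is our own assumption, not an IoT standard.

```python
# A toy IoT setup over plain TCP/IP: a collector accepts one sensor
# connection and stores the JSON readings the sensor pushes to it.
import json
import socket
import threading

readings = []

def collector(server_sock):
    """Accept a single sensor connection and store its readings."""
    conn, _ = server_sock.accept()
    with conn:
        for line in conn.makefile("r"):
            readings.append(json.loads(line))

# Start the collector on an ephemeral localhost port.
server = socket.create_server(("127.0.0.1", 0))
port = server.getsockname()[1]
worker = threading.Thread(target=collector, args=(server,))
worker.start()

# The "sensor": open a TCP connection and push two temperature readings.
with socket.create_connection(("127.0.0.1", port)) as sensor:
    for temp in (21.5, 21.7):
        msg = json.dumps({"sensor": "t1", "temp": temp}) + "\n"
        sensor.sendall(msg.encode())

worker.join(timeout=2)  # collector finishes once the sensor disconnects
server.close()
print(readings)
```

A Zigbee or Z-Wave equivalent would require vendor-specific stacks and radio hardware; this contrast is exactly why widespread TCP/IP knowledge lowers the barrier to new IoT applications.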

Security has been largely neglected in most IoT setups. As more devices (from different vendors) are added to a network, this increasingly becomes a risk, especially when life-critical devices (such as respiratory systems and real-time blood monitors) get connected.

Building program structure: Functional Programming is becoming more mature, emphasizing the need for polyglot programmers.

Keeping a grip on all possible conditions and states in a modern software application has become increasingly difficult, especially when concurrent execution (as with web applications) comes into play. Better tooling for writing and generating code and for developing test and load scripts helps the modern-day software engineer significantly, but the search for better (read: more productive) programming languages is more alive than ever.

We see a renewed interest in Functional Programming (FP). This declarative programming paradigm dates back to the lambda calculus of the 1930s, which was used to investigate function definitions and recursion. FP’s concept of pure functions (a function called with the same arguments always returns the same result, and has no side effects) can simplify the design of an application and guards the software engineer against unexpected behaviour inside functions or objects, unlike imperative programming, where function calls can result in changes of state. FP languages are becoming more and more mature, and we see real-world applications being developed, e.g. in Scala.
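The pure-function idea can be shown in a few lines; a hypothetical pricing example (the names are ours, not from any real codebase) contrasts the two styles:

```python
# Pure vs. impure: a pure function's result depends only on its
# arguments; an impure one depends on (and may mutate) hidden state.
from functools import reduce

# Impure: the result silently changes whenever _discount changes.
_discount = {"rate": 0.10}

def impure_price(amount):
    return amount * (1 - _discount["rate"])

# Pure: everything the function needs is passed in; no side effects.
def pure_price(amount, rate):
    return amount * (1 - rate)

# Pure functions compose safely, e.g. totalling a basket with reduce:
def total(prices, rate):
    return reduce(lambda acc, p: acc + pure_price(p, rate), prices, 0.0)

print(pure_price(100.0, 0.10))
print(total([100.0, 50.0], 0.10))
```

Because `pure_price` cannot be affected by anything outside its arguments, it is trivially safe under concurrent execution, which is exactly the property that makes FP attractive for the state-management problems described above.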

Furthermore, we see FP languages focusing on very specific application areas. The R language, for example, has become very popular for many forms of data manipulation. Microsoft’s Azure Machine Learning uses R extensively to process large amounts of data.

The rise of FP languages stresses the need for polyglot software engineers: professionals who master several programming languages and are willing to invest in learning a new one when needed. The ability to mix and match different languages in one project greatly increases the value of an engineer; they will be an asset to any project team.

Microservices and Containerization: Building apps a better way, one small service at a time.

The concept of containerization stems from the shipping industry, where a highly standardized shape and dimension for the shipping unit has propelled the way we transport goods around the globe. Software deployment (transporting and installing applications on different environments) is still very much a ‘bulk’ process in which many modifications need to be made to the application to get it up and running in a new environment. Software containers are hip and happening. They try to solve this complicated and error-prone process of moving applications from one environment to another by providing common tools to build, ship and run distributed applications. Many start-ups are active in this field, each with their own idea of how to solve it. The recently launched Open Container Project tries to unite these many views and establish a set of common standards around software container technology. The winner in this field has not yet been determined, although many believe Docker has a good chance of becoming the de facto standard (they’ll probably get bought by one of the big vendors…).

The concept of microservices drives the need for containerization even further. In this software architecture style, complex applications are composed of small, independent processes communicating through APIs. The programming language of these small services should be irrelevant, and each service should focus on one small task. This can result in large-scale service architectures with complex dependencies between the services (versioning, performance, etc.). New tools like the Dutch-based Vamp.io help DevOps teams run complex, mission-critical microservices-based architectures.
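The composition idea above can be sketched in miniature. In this toy, plain function calls with dicts stand in for HTTP/JSON calls between services; the service names, routes and data are made up for illustration.

```python
# Toy microservice composition: two independent "services" that only
# communicate through a small API, composed by a gateway.

def product_service(request):
    """Knows only about products (could run in its own container)."""
    catalog = {"42": {"name": "TechRadar mug", "price": 10.0}}
    return catalog.get(request["product_id"], {"error": "not found"})

def pricing_service(request):
    """Knows only about discounts; could be written in another language."""
    rate = 0.10 if request["customer"] == "gold" else 0.0
    return {"price": round(request["price"] * (1 - rate), 2)}

def gateway(product_id, customer):
    """Composes the two services, as an API gateway would over HTTP."""
    product = product_service({"product_id": product_id})
    if "error" in product:
        return product
    quote = pricing_service({"price": product["price"],
                             "customer": customer})
    return {"name": product["name"], "price": quote["price"]}

print(gateway("42", "gold"))
```

Note that the gateway knows nothing about how either service stores its data; each service can be deployed, versioned and scaled independently, which is precisely where containers and tools like Vamp.io come in.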

Data Scientists: The idols of the future.

Data Scientists are hot! New university programs in the field of Data Science are starting everywhere. The promise of Big Data has awoken something in many boardrooms. Traditional data analysis tooling (like SPSS) is getting renewed attention, but new players are entering the field as well. Many of these newcomers, however, focus on just part of the field; Tableau, for example, is mainly focused on data visualization and will help you very little with detailed analysis.

Microsoft Azure Machine Learning brings seemingly endless, cheap computational power to the masses. But as always with powerful tooling: you need to know what you are doing. Deep knowledge of data modelling, hypotheses, test sets, algorithms, etc. is required to extract knowledge from large volumes of data. We believe Data Scientists who are able to harness the computational power now available will be the rock stars of the near future.
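To make the test-set point concrete, here is a toy sketch of the discipline involved: a model is fitted on training data only and then judged on a held-out test set. The data is synthetic, and no real Azure ML API is used.

```python
# Fit a simple linear model on a training split and evaluate it on a
# held-out test split, the basic hygiene any data scientist applies.
import random

random.seed(7)

# Synthetic data: y = 3x + Gaussian noise.
data = [(x, 3 * x + random.gauss(0, 1)) for x in range(40)]
random.shuffle(data)
train, test = data[:30], data[30:]

def fit(points):
    """Ordinary least squares for y = intercept + slope * x."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    slope = (sum((x - mx) * (y - my) for x, y in points)
             / sum((x - mx) ** 2 for x, _ in points))
    return my - slope * mx, slope

def mse(points, intercept, slope):
    """Mean squared error on a set the model has never seen."""
    return sum((y - (intercept + slope * x)) ** 2
               for x, y in points) / len(points)

intercept, slope = fit(train)
print(f"slope ~ {slope:.2f}, held-out MSE ~ {mse(test, intercept, slope):.2f}")
```

Skipping the held-out evaluation and reporting the training error instead is exactly the kind of mistake that cheap computational power makes easy to commit at scale.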
