I’m on a bit of a rant here. Seriously, what is the difference between an Application Developer, a Software Developer and a Software Engineer? And then, of course, there is the now old-fashioned concept of a Programmer.
According to one Google search result, a “Software Engineer is a professional who applies the principles of software engineering for designing, development, maintenance, testing, and evaluation of computer software whereas Software Developer is a professional who builds software which runs across various types of computer.” Yup?
And here’s another – evidently “Application developers create mobile and computer apps and software programmers create, test programs or systems and fix bugs if and when they encounter them.” Oh wow! Talk about creating complexity through over-simplification.
A recent article on skills shortages lists the “Professionals most sought after” as “IT application developers (11%), data analysts (10%), data scientists (9%), software developers (9%) and software engineers (8%).” Translated, to me, this is data analysts (10%), data scientists (9%), and application developers / software developers / software engineers / programmers (28%).
What do I believe, as an ex-Programmer? I believe that, from a skills perspective, they are all the same, and trying to differentiate just obfuscates the criticality of this vital role in the digital and IT world today.
The fact is they all need the same competencies. And please don’t confuse a competency with a technology. Writing program code is a competency; Python is a technology. The competency of Programming / Application Development / Software Development / Software Engineering is:
Design, code and test programs that deliver the intended results in the intended technology environment or environments.
In today’s world, especially in the agile world, this is what a Programmer / Application Developer / Software Developer / Software Engineer is expected to do. The intended results are dependent on what is defined by the respective Use Case, and the intended technology environment could be front-end, back-end, web, mobile, AI, machine learning, embedded, database, etc. etc. And, of course, there are a plethora of technologies within each of those technology environments.
Technologies and technology stacks may change, but the competency does not. At lower levels of complexity, an incumbent may do a bit less, or work on less complex programs / applications.
When we understand this, it becomes much easier for us to source the skills. How long does it take to train someone to an acceptable level in the competency as defined above? The answer is probably 6 months to a year. How long does it take, once proficient in one technology, to learn another? Probably 1 to 2 months, depending on the size of the technology shift.
Why, then, do we find it so difficult to source a Python Developer, when moving from PHP or C++, for example, to Python takes only 1 to 2 months?
For us to reduce the “skills shortages”, especially at the higher levels of proficiency and experience, we need to multi-skill in terms of the technologies.
This benefits the employer organisation by creating more flexibility in terms of where and how people can be used in the organisation. And it benefits the employee by broadening their skills base and their earning potential.
This has the added benefit that it “frees up” job opportunities at the lower levels of proficiency and experience in the organisation, making room for new entrants.
It’s a win-win-win, and a win for the industry as a whole.
Complete the form below if you would like assistance in defining your IT and digital roles and competencies, and creating a career framework – the organisation’s most strategic structure today.