The time it takes to complete a task matters a great deal. The advantages of doing things faster are apparent: we save time and capital, and we can often gain an edge over our competitors.
Computers and software are useful for many reasons, and speed is one of the cardinal ones. When we talk of speed, many things come to mind. Some may think of light, which travels at about 300,000 km per second; to them, nothing is faster.
At this time the world’s fastest man is Usain Bolt, who covered 100 meters in 9.58 seconds; not by car, of course, but on foot, just as the ancient Greeks did at their Olympic Games. The fastest woman ever is still Florence Griffith-Joyner, popularly known as “Flo-Jo”.
In computing, we often need to enter data, so typing speed counts too. Barbara Blackburn still holds the record for typing in the English language: she reached 212 words per minute. That is phenomenal, considering that the average person types at only about 36 words per minute.
Speed has many synonyms. In physics we speak of velocity. In economics we speak of the velocity of money, as famously defined by the American economist Irving Fisher.
In software development, when we talk of overhead we are talking about a loss in performance, so the concern is either the speed of the software or its size. For many years programmers had to choose between the two, because memory was expensive and scarce and processors were not very fast.

Today’s cell phones have more processing power and memory than the personal computers of twenty years ago, and those machines came with price tags of over 2,000 US dollars. Old advertisements are useful history lessons: a 1995 PC advert talked about conserving cash and splurging on power for only $1,999, and the machine looked rather bulky compared to contemporary ones. At the other end of the scale, China’s Tianhe-2 is today the world’s fastest supercomputer, boasting a speed of 33.86 petaflops (a petaflop is one quadrillion floating-point operations per second).

To the novice, overhead might seem completely new and rather esoteric. It isn’t anything difficult, and you can understand it easily.
Is performance, or overhead, still an issue today?
Some have argued that the only thing that really matters is whether the software answers these two questions:
- Does your software do what it’s required to do?
- Does your software help users do their job quicker and easier?
Some say that real programmers develop software in assembly language, C and C++. As the founder and sole proprietor of Boachsoft, I know what they are talking about, and I totally agree with them. Back in 1996 it was reported that Bill Gates argued with Microsoft’s developers over their choice of C++ over assembly language. The issue at stake was performance: Bill Gates pointed to the overhead, the performance lost by using anything other than assembly language, while the developers pointed to the advantages of C++, such as abstraction.
Today C++ is widely considered one of the best of the mainstream programming languages when it comes to performance and overhead.
Many say that the computer programs out there already do what they are required to do within a reasonable amount of time. They argue that most programs sit idle for long stretches waiting for user input or commands, so there is no need to talk about performance; Java, JavaScript, PHP, .NET and other managed or interpreted languages are all they need. They also cite Moore’s law, the observation that transistor counts (and, roughly, processing power) double approximately every two years, to buttress their point. They never miss an opportunity to mention the programmer productivity, and therefore the increased profits, gained from using such languages. Profits and productivity are indeed important, but it looks as though performance is being swept under the carpet, and overhead can also lead to reduced profits.
While this may be true, there are many computer programs for which performance is desirable. A large share of today’s applications run on tablets and cell phones. To deliver the needed performance, processor manufacturers have kept increasing processing power, and that shortens battery life. One benefit of well-engineered software with little overhead, therefore, is longer battery life. I am sure you get irritated when you charge your phone or tablet only for it to run flat in no time, especially when you are in town with no way of recharging it. Better-developed software would lead to less power consumption by processors.
There are many other programs and applications that would benefit from improved performance today. One good category is graphics-intensive software: games, video, and 3D rendering, including 3D printing and flight or engineering simulation. Another is scientific computation, such as applied non-linear programming. Multimedia is yet another. Have you ever tried running several processor-intensive applications on a laptop at once? You soon notice the latency, and the operating system may even crash with the infamous blue screen.
Servers would benefit enormously from improved performance. A server handling upwards of a million connections per second needs to serve each request in microseconds; shaving overhead at that scale translates directly into throughput. The same goes for enterprise-level databases.
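To see why microseconds matter at that scale, here is a rough back-of-envelope sketch in C++. The per-request service times are purely illustrative assumptions, not measurements of any particular server:

```cpp
#include <cstdio>

int main() {
    // Illustrative per-request service times in microseconds
    // (assumptions for the sake of the argument, not benchmarks).
    const double service_time_us[] = {1000.0, 100.0, 10.0, 1.0};

    for (double us : service_time_us) {
        // One core spends `us` microseconds on each request, so it can
        // serve 1,000,000 / us requests per second, ignoring I/O and queuing.
        double requests_per_second = 1000000.0 / us;
        std::printf("%7.0f us per request -> %9.0f requests/s per core\n",
                    us, requests_per_second);
    }
    return 0;
}
```

The point of the sketch is simply that cutting per-request overhead from a millisecond to ten microseconds multiplies the throughput of the same hardware a hundredfold.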
So what really determines the performance of computer applications? The determinants can be broadly divided into two: the hardware and the software. Over the years, as stated earlier, we have seen great leaps in processing power, input/output and network connectivity, and I am sure this will continue.
Assume an overhead of, say, a factor of 100 in the operating system, a factor of 100 in the interpreter or runtime, a factor of 100 in the end-user software and a factor of 100 in the database. What would the total overhead be in a loop that executes, say, a million times? The total overhead would be:
Total Overhead = 100 * 100 * 100 * 100 * 1,000,000
That comes to an overhead factor of 100 trillion, or 10^14.
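For readers who prefer to see the arithmetic spelled out, here is a minimal C++ sketch of the same calculation, using the purely illustrative factors from the example above:

```cpp
#include <cstdio>

int main() {
    // Illustrative overhead factors from the example above
    // (assumptions, not measurements of any real system).
    const double os_overhead      = 100.0;   // operating system
    const double runtime_overhead = 100.0;   // interpreter or runtime
    const double app_overhead     = 100.0;   // end-user software
    const double db_overhead      = 100.0;   // database
    const double loop_iterations  = 1000000.0;

    // Overheads in layered software multiply rather than add.
    double total = os_overhead * runtime_overhead * app_overhead
                 * db_overhead * loop_iterations;

    std::printf("Total overhead factor: %.0e\n", total);  // prints 1e+14
    return 0;
}
```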
The ramifications are far-reaching. Needless to say, performance and overhead still need to be taken into consideration when developing software.
Are compilers today taking full advantage of modern processor instructions? Should we leave the development of assemblers and compilers to microprocessor manufacturers? I call for a debate on this issue. Assembly language isn’t dead yet, and it ought to be given the emphasis it once was. There are many areas where the use of assembly language in development would lead to less overhead and greater optimization. Operating system developers need to reduce overhead, and so must the developers of compilers, interpreters and runtimes. Optimization ought to be given greater priority in these areas because they have a far-reaching impact on the performance of end-user applications.
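To make the point about instruction-level optimization more concrete, here is a small illustrative C++ sketch using x86 SSE intrinsics. The function names and data are my own example rather than code from any particular product; it sums an array of floats first with a plain loop and then with explicit 128-bit vector instructions that process four floats per addition:

```cpp
#include <immintrin.h>  // SSE intrinsics (x86/x86-64 only)
#include <cstdio>

// Plain scalar sum: the compiler may or may not vectorize this loop.
float scalar_sum(const float* data, int n) {
    float total = 0.0f;
    for (int i = 0; i < n; ++i) total += data[i];
    return total;
}

// Hand-vectorized sum using 128-bit SSE registers: four floats per add.
float sse_sum(const float* data, int n) {
    __m128 acc = _mm_setzero_ps();
    int i = 0;
    for (; i + 4 <= n; i += 4)
        acc = _mm_add_ps(acc, _mm_loadu_ps(data + i));

    // Reduce the four partial sums held in the register to one float.
    float partial[4];
    _mm_storeu_ps(partial, acc);
    float total = partial[0] + partial[1] + partial[2] + partial[3];

    // Handle any remaining elements that did not fill a full vector.
    for (; i < n; ++i) total += data[i];
    return total;
}

int main() {
    float data[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    std::printf("scalar: %.1f  sse: %.1f\n",
                scalar_sum(data, 8), sse_sum(data, 8));
    return 0;
}
```

Whether such hand-tuning pays off depends on the workload and on how well the compiler already vectorizes the plain loop, which is exactly the debate being called for.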
The main argument for runtimes is cross-platform compatibility. Simply put, there are many different microprocessors, each with its own instruction set, so runtimes allow applications to run across many platforms. This is true, and runtimes do solve the problem to a certain extent. We ought to aim at greater international standardization of microprocessor instruction sets, though this may be difficult given business practices and patents. Processors that understood and processed a common instruction set would reduce the need for cross-platform runtimes, and software engineers would then be able to optimize code at the processor level. I must admit, though, that we are far from such a standard.
So, in summary, I would say speed still matters. The use of assembly language and better coding skills would reduce overhead and improve performance. Hopefully, over the next few years, coupled with processor and hardware improvements, we will see an improvement in computer programming and software engineering. That should lead to faster, more efficient software, less power consumption and longer battery life.
We would have shorter response times, better throughput for a given piece of work, lower utilization of computing resources, and faster data compression and decompression, which in turn means faster network transfers. All of this means increased productivity, more profits and, lastly, better governance.
About the author:
Yaw Boakye-Yiadom is the founder and sole proprietor of Boachsoft, a global software company committed to excellence.
Boachsoft also makes excellent landlord software.