Can you imagine that simply writing code in an optimized way can save millions of dollars in computational costs for any organization on this planet?
It's fascinating whether you're a tech nerd or a non-techie. But yes, it can, and it's all possible with one of the most powerful concepts of modern computer science: DSA (Data Structures & Algorithms).
For any organization, its customers are the most important asset, and no organization can compromise on UX (User Experience), as it directly impacts its survival in the market.
As a software developer, the goal is never to change the fact that 5 + 2 = 7, but to compute it while reducing any unnecessary complexity. The speed and accuracy of the output are the key parameters behind a great UX, and they need to be maintained.
Would you love using your favorite Instagram if it took 10 seconds just to like a single post?
For that maintenance, organizations can do one of two things: invest more and more in infrastructure (resources), or let developers apply one of their most important skills: DSA. (Applying DSA alone might not reduce the investment in infrastructure, since that depends on several other parameters, but it does save a lot of cost.)
DSA
When we have a problem in front of us, we solve it in the form of an algorithm. Every algorithm is a sequence of steps, and each step is followed one after another, step by step.
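For example, here's a tiny sketch in Python (my own illustration, not taken from any real app) of an algorithm written as explicit steps: finding the largest number in a list.

```python
# A minimal, step-by-step algorithm: find the largest number in a list.

def find_largest(numbers):
    # Step 1: assume the first number is the largest so far
    largest = numbers[0]
    # Step 2: walk through every remaining number
    for n in numbers[1:]:
        # Step 3: if the current number is bigger, it becomes the new largest
        if n > largest:
            largest = n
    # Step 4: after checking every number, return the result
    return largest

print(find_largest([3, 7, 2, 9, 4]))  # prints 9
```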
Let's suppose we have an idea: creating an Instagram post-like feature. If we have a product, we have code (program files). Program files are always stored on some storage medium, the hard disk. When we run or execute any program file, it loads into main memory (RAM).
So far, we've created the code for the Instagram post-like feature and stored it on our hard disk; when we execute it, it loads into RAM; and most importantly, the CPU is responsible for performing all the steps/operations in the instructions (the algorithm) we've written.
Can we visualize it in this way?
Idea > Software > Code > Hard Disk > RAM > CPU
We had an idea, the post-like feature for the software "Instagram"; we had already written the code and stored it in some folder on the hard disk; when we executed the program, it loaded into RAM; and all the written operations/steps (the algorithm) were performed by the CPU.
So our Instagram post-like feature takes 10 seconds to like just one post? Why?
Because the code, whether written by us or by some other developer, is simply not optimized, yet it works. (Initially, any code is measured by one single metric: does it run?)
As developers and engineers gain experience, they learn to care about the quality of their code.
I'm going to hate the Instagram app if it takes 10 seconds just to like a single post; that will not be a good UX for me.
There can be another, optimized snippet of code that does the same thing (liking a post) but performs better than the one we've written, which is eating up 10 critical seconds.
It's all a game of data and how it is structured. This is where DSA comes in.
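To make that concrete, here's a minimal, hypothetical sketch in Python (not Instagram's actual code): the same "has this user already liked the post?" check, first with the likes stored in a plain list and then in a set. The data is identical; only its structure changes, and so does the speed.

```python
import timeit

# Hypothetical data: a few million user IDs who liked a post.
likes_list = list(range(5_000_000))

# Unoptimized: a plain list means every check scans the likes one by one.
def has_liked_slow(user_id):
    return user_id in likes_list   # O(n) linear scan

# Optimized: the same data in a set (hash table) gives near-instant lookups.
likes_set = set(likes_list)

def has_liked_fast(user_id):
    return user_id in likes_set    # O(1) average-case lookup

# Time both checks for a user ID near the end of the list.
print(timeit.timeit(lambda: has_liked_slow(4_999_999), number=10))
print(timeit.timeit(lambda: has_liked_fast(4_999_999), number=10))
```

Both versions return exactly the same answer; choosing the right data structure is what removes those critical seconds.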
Following up on one of my original LinkedIn posts, let's take one more example.
If the "Add to Cart" button/feature takes 10 critical seconds to work and proceed to next, then as a regular customer of "Amazon", I will stop using it and possibly move to the next eCommerce like Flipkart.
In case, the average purchase I make on Amazon is $20, and 1 million similar users are having the same issue with that "Add to Cart" thing, we'll possibly quit the platform and it will cost Amazon $20 Million straight loss.
The point here is not to calculate exactly how much it would cost Amazon or other eCommerce platforms.
The point is very clear: it is extremely important to build highly efficient apps that serve customers with speed and accurate output.
Stay tuned for the next one...