- IBM Champion 2018
- The Man who knew Infinity – Srinivasa Ramanujan Iyengar
- Is your Enterprise adopting the right Cloud Service Model?
- Is your Enterprise Data on Cloud?
- Enterprise Modernization
- American Airlines – The Era of negative customer 360
- Design Thinking – The value it brings
- The Modernization Realm
- “Mirror Mirror on the wall, who is the fairest in the land?”
- 2009 – A Post Mortem
Another 22 December passes by without any pomp and show. On this day, 129 years ago, Srinivasa Ramanujan, the mathematical genius, was born in Kumbakonam, a small town in India. Raised in a small house on Sarangapani Sannidhi Street, overlooking the Sarangapani Vishnu temple, Ramanujan became a world-renowned mathematician and a Fellow of the Royal Society at Cambridge, with practically no formal mathematical education to his credit.
He stumbled upon George Carr's “Synopsis of Elementary Results in Pure and Applied Mathematics” at school, which made a huge impact on him and marked his entry into the world of mathematics. Ramanujan was exultant when he found a way to express trigonometric functions without reference to right-angled triangles; however, when he discovered that the legendary mathematician Euler had proved the same result nearly 150 years earlier, he stashed all his findings away in the roof of his house. He could never reconcile himself to failure when it came to mathematics.
Ramachandra Rao, then Collector of Nellore and a distinguished mathematician himself, said of Ramanujan:
“A short uncouth figure, stout, unshaven, not over clean, with one conspicuous feature-shining eyes- walked in with a frayed notebook under his arm. He was miserably poor. … He opened his book and began to explain some of his discoveries. I saw quite at once that there was something out of the way; but my knowledge did not permit me to judge whether he talked sense or nonsense. … I asked him what he wanted. He said he wanted a pittance to live on so that he might pursue his researches”
Ramanujan’s mathematics always led to infinity. The questions he posed in the journal of the Indian Mathematical Society left many astounded and went unanswered for months, until he supplied the answers himself.
An ardent devotee of the goddess of Namakkal, he always related mathematics to the universe, space and God, and felt his discoveries were all the power of the goddess. At a time when crossing the seas was prohibited in Brahmin culture, he slept at the Namakkal temple for three full days seeking an answer, and it is said the goddess appeared in his dream and gave him permission to travel to Cambridge.
One fine morning, Hardy opened a letter addressed to him containing Ramanujan's mathematical work. He kept thinking about the profound depth of the mathematics the letter traversed all afternoon, even while playing cricket. Together with Littlewood, he concluded that such work could only be the art of a true genius. How right they were.
Ramanujan was making upma, a traditional Tamil Nadu dish, in his room at Trinity, when his close friend and batchmate P. C. Mahalanobis (who later founded the Indian Statistical Institute) walked in with a very challenging question from the morning newspaper. Within minutes, Ramanujan not only provided the answer but came up with a general solution that would answer any such question ad infinitum.
Deprived of vegetarian food options, engrossed in mathematics late into the night, and getting no proper sleep, Ramanujan was diagnosed with tuberculosis. When Hardy came to meet him at the hospital in a cab bearing the number 1729, which Hardy considered a rather dull number and a bad omen, Ramanujan spontaneously replied that it was in fact a very interesting number: the smallest number expressible as the sum of two cubes in two different ways.
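Ramanujan's claim about 1729 is easy to verify computationally. A short sketch (purely illustrative, not from any of his notebooks) that finds every way of writing a number as a sum of two positive cubes:

```python
# Verify that 1729 is the smallest number expressible as the
# sum of two positive cubes in two different ways.

def cube_sum_pairs(n):
    """Return all pairs (a, b) with a <= b and a**3 + b**3 == n."""
    pairs = []
    a = 1
    while 2 * a ** 3 <= n:
        rem = n - a ** 3
        b = round(rem ** (1 / 3))
        # Check neighbours to guard against floating-point rounding
        # around the cube root.
        for cand in (b - 1, b, b + 1):
            if cand >= a and cand ** 3 == rem:
                pairs.append((a, cand))
        a += 1
    return pairs

print(cube_sum_pairs(1729))   # [(1, 12), (9, 10)]

# Confirm no smaller number has two such representations.
smallest = next(n for n in range(2, 2000) if len(cube_sum_pairs(n)) >= 2)
print(smallest)               # 1729
```

The two representations are 1729 = 1³ + 12³ = 9³ + 10³, exactly the property Ramanujan quoted from his hospital bed.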
Nearing his death, his mother ran to a famous astrologer with Ramanujan's horoscope, only to be told that it was the horoscope of a person who would either live long, or become famous and die young.
The tussle between Ramanujan's wife and mother may have contributed to his sudden and untimely death; however, with a plethora of his works still remaining mysterious to young mathematicians, the legacy of Ramanujan lives on.
As more and more organizations try to embrace the cloud, with varying degrees of conversation and maturity, it is extremely crucial to understand the cloud service model best suited to your enterprise. Many times, enterprises pick a cloud service model before even verifying whether it addresses the right needs. For example, organizations that are .NET shops immediately default to Azure, without considering their various workloads, data needs, integration nuances, performance and throughput requirements, and other parameters. PaaS is still maturing, and not all workloads are tailor-made for it.
As we all know, IaaS abstracts the underlying infrastructure layer, so that organizations no longer have to worry about hardware, power, cooling, hardware procurement, and other infrastructure needs. PaaS goes a level higher and abstracts the OS, database, application server, and programming language; the consumer takes care of only the application and data, with all other layers abstracted away. SaaS is the ultimate level of abstraction: the entire application is delivered, with the consumer's focus only on administering users of the system.
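The abstraction levels described above can be summarized as a simple responsibility matrix. A minimal sketch (the layer split below is the commonly cited one, not any vendor's official definition):

```python
# Which layers the consumer still manages under each cloud service model;
# "provider" marks layers abstracted away by the cloud vendor.

LAYERS = ["facilities", "hardware", "virtualization", "os",
          "middleware", "runtime", "data", "application", "users"]

# Index of the first layer the consumer manages, per model.
CONSUMER_FROM = {"on-prem": 0, "iaas": 3, "paas": 6, "saas": 8}

def responsibilities(model):
    """Return {layer: 'consumer' | 'provider'} for a service model."""
    start = CONSUMER_FROM[model]
    return {layer: ("consumer" if i >= start else "provider")
            for i, layer in enumerate(LAYERS)}

print(responsibilities("paas"))
# Under PaaS only data, application and user administration remain with
# the consumer; under SaaS only user administration does.
```

The matrix makes the trade-off discussed below concrete: the further right you go (IaaS to SaaS), the fewer rows you own, and the less control you retain.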
If the service is not the organization's core competency, adopting SaaS is a good alternative as long as it meets the business requirements. With SaaS, the consumer has little control over the application and no real say in SLAs, the underlying architecture, or other needs. The advantage, however, is that organizations can quickly be up and running with SaaS-based solutions, and consumers need not worry about patches, fixes, new version releases, device support, and the rest of the changing landscape.
If you want full control of your application stack from a performance or security perspective, and the application has high scalability and performance requirements, IaaS works best. The consumer gets complete control over the underlying architecture (scalability, failover, etc.), but more control means more work and a longer time to market. PaaS sits somewhere in between and is meant for rapid build-and-teardown and rapid innovation workloads, as the focus is only on building software, not on managing hardware, OS, database, or programming stacks. PaaS promises increased speed to market but is the least mature of the cloud service models.
My suggestion is for organizations not to settle for a favorite vendor, but instead to do a thorough analysis and adopt all three service models in some form or the other.
Do you echo my thoughts?
Rob Watson, one of the best environmental minds, once famously said, “We are either going to be losers or heroes – there’s no room for anyone or anything in between.” The context was the green revolution, but it is just as apt for your legacy data. It is a decision between keeping your legacy System of Record on premise and getting no insights out of it, versus moving it into the cloud and mixing it with unstructured data for better insights (and, most importantly, being a hero too).
Data movement has become seamless and fast with the advent of various data connectors, and it is extremely critical that organizations bring their core Systems of Record (structured in nature) together with social media feeds, for better customer insights and awareness of customer preferences. Some of the connectors available for this:
1. InfoSphere System z Connector – One of the best tools available for System z resources (VSAM, Db2, IMS), with efficiency and performance built in. It consumes minimal MIPS as it reads the binaries directly, and is well suited to large volumes of data.
2. Syncsort DMX-h – A Hadoop ETL solution that seamlessly shifts heavy workloads from the mainframe into the Hadoop ecosystem.
3. z/OS Connect – Reads any of your System z data sources and exposes them as REST-based APIs. It uses WOLA (WebSphere Optimized Local Adapter) on the back end for bi-directional communication, and is well suited to mobile-facing interfaces.
4. Sqoop – Meant only for relational data sources, as it is SQL-based. The lack of support for “WITH UR” in its statements creates data-locking issues, and it is not meant for large volumes of data.
5. Rocket Mainframe Data Service – Available on IBM Bluemix, this provides an easy way to leverage System z data for cloud and mobile services. Developers can use MongoDB- or SQL-based connectivity to access System z data.
6. Big SQL and Spark – For SQL-based access and analytics on the data once it has landed in Hadoop.
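Once System z data is exposed through z/OS Connect (option 3 above), consuming it from any platform is plain REST. A hedged sketch of the client side, assuming a hypothetical endpoint: the host, service name, path layout, and parameter below are illustrative only, and the real paths and auth scheme come from your z/OS Connect configuration:

```python
import urllib.request

# Hypothetical z/OS Connect endpoint fronting a CICS/IMS-backed service.
BASE_URL = "https://zosconnect.example.com:9443/zosConnect/services"

def build_request(service, params):
    """Build a GET request invoking a z/OS Connect service.

    Only constructs the request object; sending it (and handling
    TLS and authentication) is left to the caller.
    """
    query = "&".join(f"{k}={v}" for k, v in sorted(params.items()))
    url = f"{BASE_URL}/{service}?{query}"
    return urllib.request.Request(url, headers={"Accept": "application/json"})

req = build_request("customerInquiry", {"custId": "10042"})
print(req.full_url)
# https://zosconnect.example.com:9443/zosConnect/services/customerInquiry?custId=10042
```

The point of the sketch is that the mainframe side becomes invisible to the consumer: a mobile or cloud app sees only a JSON-over-HTTPS API, which is exactly what makes this connector apt for mobile-facing interfaces.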
Enterprises should have a crystallized view of the target platform, data volumes, performance aspects, and end-user architecture before closing in on the options above.
As they say in Texas, “If all you ever do is all you’ve ever done, then all you’ll ever get is all you ever got.” So why keep your data at bay, when it can be brought into the cloud and made more meaningful?
What are your experiences?
Enterprises have continued to enhance and enrich their mainframe applications over the years with additional features and functionality, and most systems in their current form bear little or no resemblance to their original functionality and size. The original authors of these applications have either retired or are on the verge of retirement.
The burgeoning growth of mainframe applications and dwindling mainframe skills adversely impact organizations’ agility to respond to business changes, while the increased complexity and loss of SME knowledge result in higher maintenance costs. It is imperative to address these challenges and ensure the next generation of programmers continues to design and develop fit-for-purpose workloads on mainframe technologies. We see increased demand across the industry in the areas of relational data management systems, business rules externalization, and exploitation of the latest technologies available on the current generation of the mainframe platform.
To address these challenges in legacy modernization environments, it is critical to evaluate each application against three dimensions – business alignment, technology currency, and architecture fitment – and their sub-parameters.
Enterprises need to adopt one, or a combination, of the following approaches, as per the business capability envisaged in the long term:
• Optimization – Reduces technology consumption on the mainframe platform itself. Candidates include MIPS optimization, batch-window reduction, offloading to zIIP/IFL specialty processors, software license rationalization, exiting unsupported third-party software, leveraging the latest features of the software, and revisiting the application programs.
• Migration – Ports applications off the mainframe to Linux/Unix/Windows (L-U-W) platforms with no additional business capability, carrying forward the source application’s design and modularity. It uses language compilers and runtime environments from Micro Focus, Dell (Clerity), or Oracle that mimic the source environment. Migration helps address skill scarcity and prepares the ground for future enhancements and modifications.
• Modernization – Gives the applications a facelift: web enablement of the presentation tier, componentization of existing applications for better maintainability (i.e., separation of business logic and data access), rationalization of batch processes, and potential rewrites in object-oriented languages on the mainframe platform itself, with additional functionality – for example, COBOL to Java on System z, REST-based services, z/OS Connect, service componentization, moving data sources to Hadoop, etc.
• Re-engineering – Rebuilds applications on L-U-W platforms by extracting system functionality and business rules from the existing applications, and adds new use cases from the business. This is a two-stage approach: ‘Reverse Engineering’ produces a detailed system overview and requirement documents; ‘Forward Engineering’ develops applications in Java or .NET to meet the functional and non-functional requirements.
Is your enterprise getting modernized? What methodologies and approaches toward modernization are you adopting?
In the era of Customer 360, American Airlines has proved that Customer Negative 360 matters too. I was booked on an AA flight from San Antonio to Newark via Charlotte, with over an hour's layover at Charlotte. As luck would have it, the flight from San Antonio was delayed due to a mechanical failure, and as a result we missed the connection at Charlotte.
We were all given a 1-800 number to plan an alternative. The customer service representative was rude, in a hurry, and definitely not in the mood to help me. I had to give him all the alternate options myself (Philadelphia, JFK, LGA), as he lacked the basics of geography. Finally, the representative put me on a flight the next morning from Charlotte.
We reached Charlotte late, I missed my connection, and there was a long wait for the hotel voucher. With nearly 50-odd passengers waiting, AA had just 10 vouchers, for a hotel nearly 20 miles away. It was 1 AM, and one hour into the queue I had still not been called, with AA taking its own sweet time to deal with passengers one by one. When I was finally called, the vouchers were gone, and I was told to head to any hotel and claim a reimbursement of 75 USD on their website. Who on earth gets a hotel for 75 USD near an airport?
Not only did I miss my meeting the next morning, I also suffered a monetary loss, with no compensation whatsoever.
This is what I would term Customer Negative 360.
We are in an era where social media is integrated with your core System of Record to derive actionable insights into customer preferences; we cannot afford to have a negative 360-degree view of the customer forming, at any cost.
What are your thoughts?
I have been working alongside various customers over the last year, facilitating discussions on architecture and solution design, primarily around IBM Cloud (Bluemix PaaS and SoftLayer IaaS). From points of view to architectural implementations and cloud assessments, customers are at varying stages, and of varying degrees of thinking, in implementing cloud.
A significant step in the cloud journey is establishing the feasibility of the use case and its importance to the customer. Not many customers have a clear cloud adoption strategy laid out, and that is where I feel Design Thinking becomes essential.
Starting with a high-level overview of the customer's pain areas, Design Thinking can drill down to finer design elements, which can be made available as a prototype, or even production-ready, in less than a month. This minimum viable product has to be created in the minimum amount of time.
We should entice customers into a design thinking workshop and ensure that the product thus created gets implemented.
At this juncture, it is worth noting where Bluemix scores well:
- I have seen many customers move out of traditional monolithic architectures to leaner, simplified microservices architectures and participate in the API economy; API Management in Bluemix aids in creating, managing, and enforcing APIs.
- IoT as a theme is fundamentally strong in Bluemix. The ease with which a device can be added to Bluemix, and the way data emitted from sensors is transmitted via the MQTT protocol, make IoT seamless.
- I am not a great advocate of the Bluemix MobileFirst services; however, the application and platform tier (push notifications, storing mobile app data in Cloudant), the app lifecycle tier (checking vulnerabilities via AppScan, improving the mobile experience through quality assurance), and the insights tier (managing and delivering mobile app content) together make mobile a fitting use case for Bluemix.
- Mainframe Integration and Hybrid Integration
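The device-to-Bluemix IoT flow mentioned above boils down to two artifacts: an MQTT topic name and a JSON event payload. A minimal sketch of both, with no broker involved; the topic layout follows the Watson IoT Platform device-event convention (iot-2/evt/<event-id>/fmt/<format>), while the event name and sensor readings are made up for illustration:

```python
import json

def device_event_topic(event_id, fmt="json"):
    """MQTT topic a device publishes events to on Watson IoT Platform."""
    return f"iot-2/evt/{event_id}/fmt/{fmt}"

def sensor_payload(**readings):
    """Wrap sensor readings in the {'d': {...}} envelope the platform expects."""
    return json.dumps({"d": readings})

topic = device_event_topic("status")
payload = sensor_payload(temperature=22.5, humidity=48)

print(topic)    # iot-2/evt/status/fmt/json
print(payload)  # {"d": {"temperature": 22.5, "humidity": 48}}

# With an actual MQTT client (e.g. paho-mqtt, not shown here), publishing
# this event would be a single call: client.publish(topic, payload)
```

The simplicity of this contract, one topic pattern and one JSON envelope per device event, is precisely what makes onboarding devices to Bluemix feel seamless.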
What are your Bluemix experiences?