Functional Programming and Big Data

Map and Reduce have become buzzwords in big data processing, although they are not new concepts to computer scientists. Ever since the invention of functional languages, map and reduce have been the delight of computer scientists. The problem is that big data stopped at incorporating only these two concepts from functional languages, while ignoring several other interesting ones like first-class functions, filter, recursion, etc. Some of these could easily be incorporated into big data processing techniques.

A nice way to deal with this is to build a parallel interpreter for functional languages. That would make building parallel algorithms very straightforward. A classic example is using first-class functions to build symbolic logic, optimization routines, and so on.
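As a minimal sketch of the primitives discussed above (in plain Python, with made-up sample data), here is a word-count pipeline built entirely from map, filter, reduce and a first-class function:

```python
from functools import reduce

# Made-up sample data standing in for a big data source.
lines = ["big data meets functional programming",
         "map and reduce are old friends",
         ""]

# map: split every line into words
words = map(str.split, lines)

# flatten the nested lists, then filter out short tokens
tokens = [w for ws in words for w in ws]
long_tokens = filter(lambda w: len(w) > 3, tokens)

# reduce: fold the remaining tokens into a frequency table;
# count() is a first-class function passed as an argument
def count(acc, word):
    acc[word] = acc.get(word, 0) + 1
    return acc

freq = reduce(count, long_tokens, {})
print(freq)
```

Because each stage is a pure function over its input, the map and filter stages can be partitioned across machines and the reduce stage merged afterwards, which is exactly the structure big data frameworks exploit.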

Machine Learning and Lambda Calculus

Most human learning happens through symbols. We do not remember data in a quantitative fashion. Even when we remember numbers (for example, a phone number or the value of Pi) we store them as a series of symbols rather than as float/int values. Our arithmetic calculations are also symbol based. This symbolic representation gives us the power of abstraction. If we want machines to emulate humans, machines should also understand symbols. To some extent this already happens when a variable is given a name and is referred to by that name in subsequent code. But in this case the machine is not learning the symbol; rather, the programmer is in a way hardcoding it. If a machine can truly learn symbols, and learn to combine them into complex symbols and form abstractions, it will be closer to humans in learning ability. In a way, digital machines already use 0’s and 1’s as symbols at the very base level and create abstractions around them.

The need for performing operations on symbols led to the development of Lambda calculus and the Lisp family of languages. This was the first major step in AI. Although this happened more than 50 years ago, this approach to AI has not been given as much importance. The computational world got lost in other aspects like data processing and black-box model fitting (including ANNs). There needs to be a revival of symbolic manipulation and lambda calculus for AI to truly progress beyond function fitting.
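To make symbolic manipulation concrete, here is a minimal sketch in Python (the representation and names are illustrative, not a real library): expressions are nested tuples of symbols, and a small recursive function differentiates them symbolically — the kind of symbol processing Lisp pioneered.

```python
# Expressions are nested tuples: ('*', 'x', 'x') means x*x.
# d() returns the symbolic derivative of expr with respect to var.
def d(expr, var):
    if isinstance(expr, (int, float)):
        return 0
    if isinstance(expr, str):              # a bare symbol
        return 1 if expr == var else 0
    op, a, b = expr
    if op == '+':                          # sum rule
        return ('+', d(a, var), d(b, var))
    if op == '*':                          # product rule
        return ('+', ('*', d(a, var), b), ('*', a, d(b, var)))
    raise ValueError("unknown operator: " + op)

# d/dx of x*x  ->  ('+', ('*', 1, 'x'), ('*', 'x', 1)), i.e. x + x
print(d(('*', 'x', 'x'), 'x'))
```

The machine here manipulates symbols and their combinations directly, never evaluating a number until asked — the abstraction the text argues humans rely on.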

Machine Learning vs. Machine Intelligence

Is Artificial Intelligence just about beating a person at the game of Go? I would guess not: people have been playing that game for a long time, in addition to doing several other things at the same time, including learning new things. Beating a person at a game is not a lofty goal to achieve. In any case, machines have long been doing a lot of work not possible for humans, and they are becoming more intelligent over time.

So, going back to the question of Go: did the machine learn on its own? The clear answer is no. It was just following an algorithm programmed by its creators. Does that make it non-intelligent? Again, the answer is no. Just because the machine didn’t learn doesn’t mean it is not intelligent. A lot of intelligence comes from programmers inputting specific algorithms. Those algorithms sometimes update themselves, leading to “learning”. Nevertheless, the intelligence comes from the algorithms/programs, whether they learn or not.

A machine able to predict an event is intelligent because of its prediction algorithms. It may not be able to learn to predict new types of events, but it is still useful. In a way, learning is only a part of intelligence. Intelligence is converting knowledge/data into actionable information, whereas machine learning focuses only on classifying or predicting. Machine intelligence, when applied to business, can create a much bigger impact than pure machine learning. To start with, the intelligence has to be fed in by experts.
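As a sketch of what expert-fed intelligence can look like in code (all rules, field names and thresholds here are hypothetical), a domain expert writes the rules and the machine applies them mechanically, converting data into actionable information without any learning:

```python
# Expert-authored rules: each is a condition paired with an action.
# The machine's "intelligence" is mechanical application of these rules.
rules = [
    (lambda r: r["temp"] > 90, "overheating: throttle"),
    (lambda r: r["errors"] > 10, "error spike: alert on-call"),
]

def decide(reading):
    # return every action whose condition the reading triggers
    return [action for cond, action in rules if cond(reading)]

print(decide({"temp": 95, "errors": 3}))
```

A learning component could later tune the thresholds, but the system is useful, and arguably intelligent, before any learning happens.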

Cloud is a bigger revolution than mobile

Much of the boom in the tech space is attributed to the rapid increase in connectivity through mobile. That is a tremendous underestimation of what is going on right now in the tech world. Most of the ease of setting up and running businesses comes from the evolution of cloud architecture, not just from a user being able to access an app on a phone.

Although mobile connectivity serves the last mile, a significant amount of processing happens before the service reaches the end user. And all of this has become seamless because of the cheap availability of compute, storage and bandwidth on the cloud. This much-ignored fact leads to misjudgments of several business models. First, a business serving only the end user without fully leveraging the power of the cloud will never be efficient and will lose out to the competition soon enough. On the other hand, businesses focused purely on delivering cloud-based services, either directly to users or to other applications, will definitely be adding real value to the ecosystem.

In other words, for any tech-oriented company: “if you are not on the cloud, you are dead”. Being on the cloud does not mean putting up a server on AWS or Azure. It involves 1. producing services for internal and external consumption, 2. consuming cloud services wherever possible rather than reinventing the wheel, and of course 3. hosting data and compute on the cloud. In other words, you have to be part of the cloud ecosystem.

Big Data to Lean Data

Analytics has become a manpower-intensive activity. Manpower is expensive, and on top of that the analytics field is replete with jargon; as a result it has gone beyond the reach of smaller companies. With today’s advances in technology there is no reason why a large part of analytics cannot be automated. So, taking a cue from “lean manufacturing” and “lean startup”, we should also aim for “lean analytics”.

A large part of lean analytics would involve automating various steps, reducing wastage of analysis and data, and building compute-efficient models. It all starts at the data gathering stage. Once the data is machine readable, most of the rest of analytics can be automated.

There are ready-made tools for data munging, ETL, etc. Statistical analysis can be completely automated. Even model building and prediction can be largely automated. So data gathering is the key to making analytics lean. Here are the hygiene factors for lean data collection:

  • Reduce human error as much as possible through data collection automation (for example, drop-down boxes in forms instead of free text)
  • Try to figure out the data model in advance, but keep scope for unstructured data
  • Don’t collect all the data you can get. Just because you have something doesn’t mean you need it
  • Set up a cleaning process before storing
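The last two hygiene factors can be sketched together in a few lines of Python (field names, the whitelist and the sample record are all hypothetical): keep only the fields you need, and clean and validate before anything reaches storage.

```python
# "Clean before storing": validate and normalize a record against a
# small whitelist before it ever reaches the database.
def clean(record):
    allowed = ["name", "age"]               # don't store fields you don't need
    cleaned = {k: record[k] for k in allowed if k in record}
    cleaned["name"] = cleaned.get("name", "").strip().title()
    try:
        cleaned["age"] = int(cleaned.get("age", 0))
    except (ValueError, TypeError):
        cleaned["age"] = None               # flag bad data instead of guessing
    return cleaned

print(clean({"name": " ada lovelace ", "age": "36", "ssn": "redacted"}))
# {'name': 'Ada Lovelace', 'age': 36}
```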

AI in Business Analytics

Automation is the biggest growth driver for any industry. Much work is being done on automating various aspects of business: workflows, project management, marketing, etc. Business Analytics, though, is lagging quite a bit behind. Much of business analytics today involves manual data munging, analysis by an expert, modeling through the use of statistical tools, and final decision making by a manager. The bulk of this process can be automated. Imagine a system that looks at data, converts it into a usable format, identifies trends, presents those trends in human-readable reports, and generates actionable items that can be read by other systems. That is where analytics should be heading.

This vision will involve the coming together of several fields of science. There will be automated statistical models identifying trends. There will be business expertise converted into machine-readable rules. There will be natural language generation and interpretation. There will be smart dashboarding. And of course there will be databases, APIs and UIs :) All of this together is what I call artificial intelligence in business analytics. Parts of it are already happening, as when an e-commerce company decides on the fly to give a customer a discount on a particular product, or when an algorithm decides to make a particular trade in financial markets. In fact, eventually it can lead to completely automated decision making in all aspects of business! The only control remaining with humans would be the plug to the machine.
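The end-to-end flow described above can be sketched as composed stages (the data, thresholds and function names are all illustrative): each plain function is a stage that could later be swapped for a smarter, automated component.

```python
# Sketch of the analytics pipeline: ingest -> clean -> analyze -> report.
def ingest():
    return [12, 15, 11, 40, 14]             # raw metric values (made up)

def clean(data):
    return [x for x in data if x < 100]     # drop obviously broken readings

def analyze(data):
    mean = sum(data) / len(data)
    spikes = [x for x in data if x > 2 * mean]
    return {"mean": mean, "spikes": spikes}

def report(summary):
    # natural-language output a manager (or another system) can act on
    return (f"Average: {summary['mean']:.1f}; "
            f"{len(summary['spikes'])} spike(s) flagged for review")

print(report(analyze(clean(ingest()))))
```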

Fintech Revolution aka Unbundling of the Bank

Banks are entities with several disparate services. This is for legacy reasons, as people trusted banks with all money-related business (except maybe insurance). Thus banks offer payments, deposits, loans, wealth and asset management, broking, custodial, transaction advisory, investment banking and a varied set of other services. This variety could also have been necessary when economies of scale were not achievable in one single line of business. That is no longer the case, now that economies of scale can be achieved through technology. Thus there is scope for breaking the bank into several smaller entities, each serving a specific need of the customer. This is exactly what is happening right now in the fintech world. And this is how the bank is getting unbundled:

  • Digital wallets and gateways are replacing payment solutions
  • Online liquid funds are replacing deposits (especially in China; not yet happening in India)
  • P2P lending and automated lending are replacing traditional bank loans
  • Robo-advisors are taking over traditional wealth management
  • Penny brokers are replacing broking services, though this revolution has yet to disrupt OTC broking
  • Several crowd sourcing platforms are replacing investment banking at least for smaller firms

Some areas where fintech has still not made a significant mark are:

  • Smart beta investment products at very low cost
  • Custodial and security services, which are still primarily handled by banks, as a lot of trust is involved
  • International payments because of regulatory issues

The only way for a bank to beat the onset of the fintech revolution is to be part of it. This involves several things:

  1. Upgrading technology to latest stacks with more open architecture and leveraging services of other tech players
  2. Digitising the offerings
  3. Using data analytics for increasing customer LTV and reducing inefficiencies
