Thanks for visiting. The blog has moved to our company’s official site: http://g-square.in/blog/
Decisions in organizations are made at the strategic, tactical and operational levels. Analytics for generating decision-making insights also needs to operate at these three levels. What differentiates the three is the time-frame and the scope of impact of the decisions.
For example, senior management deciding on product features is a strategic decision. A branch manager asking a salesperson to sell a particular product could be tactical. An e-commerce site recommending a product to a live customer is an operational decision.
We believe a different analytics approach needs to be followed at each of the three levels of insight generation. Strategic analytics can be done by analysts using some level of automation. Tactical analytics needs to be driven by robo-analytics, where the tactical team is given insights on a daily basis through automated robo-analytics and machine learning. Operational analytics, on the other hand, should be completely automated, with both the analytics process and the decision making done by the machine.
Our company’s tools currently fall in the second category: we enable tactical decision making and insight generation through robo-analytics.
This cartoon was drawn by Jon Carter.
Data scientists and analytics professionals have been the new supermen. But the party may not last long, as automation tools will soon replace manual data analysis. The cartoon was inspired by a recent post, “Six Very Clear Signs That Your Job Is Due To Be Automated”. The article sets out six criteria for judging whether a job can be replaced by a robot.
If you examine them closely, all of these criteria apply to a data scientist’s job (especially points 2, 3 and 4). So..
Analysts, move on. The robos are coming..
Democratising data is the buzzword in analytics right now. I want to extend the concept and say that analytics has to be democratised as well. Right now, teams in business rely primarily on an analytics or a BI team for all their analytical requirements. This is somewhat outdated, as modern technologies help automate a range of analytics processes and let end users access analytics directly (apart from data and BI). Organisations are not yet moving towards such an approach because of legacy issues and a lack of alignment of interests.
Imagine a world where, apart from data, the ground force has direct access to analytics (descriptive, predictive and prescriptive) on a realtime basis. Actionable insights are delivered directly to salesforce and operations teams for taking in-time tactical decisions. Such a scenario is vastly superior to what happens right now in organisations, where a central team runs analyses once in a while and identifies actionable outcomes based on its own judgement. Democratising analytics not only improves the efficiency of analytics but also significantly reduces costs.
This model of self-service analytics can help make analytics available throughout the organisation. But to implement such a solution there needs to be a change in the mindset of organisations, especially in the central BIU/analytics teams: these teams should become enablers of analytical change rather than controllers.
There are several products in the market for implementing self-service tools. Gartner (an IT think tank) suggests that this could be the future of analytics: “This multiyear shift of focus from IT-led reporting to business-led self-service analytics passed the tipping point in 2016, which has forced a new perspective on the definition of a BI and analytics platform, and consequently, has significantly re-ordered the vendor landscape as represented in the Magic Quadrant.”
G-Square offers several self-service analytics products, most notably Narrator (Smart BI) and Clientrator (a customer analytics tool).
In continuation of this series on AI and philosophy, here I want to talk about the nature of meaning. As humans, we easily interpret language and extract meaning from sentences. Transferring this skill to machines has been a hot topic recently, but most attempts at doing so act only superficially: some focus only on grammar, others on the sentiments carried in the sentences, and so on. To really make AI understand language, we have to look at what “understanding” actually means. Fortunately, there is already a long philosophical background to this.
Both the Indian and Greek philosophical traditions have a long history of speculation on the nature of language and meaning. In the Indian philosophical tradition there have been two streams of thought. One stream claims that words independently carry meaning (much as in modern semantic theory). The other school is of the view that words themselves do not carry meaning until they are placed in a sentence; that is, the meaning of a word is driven by the overall context of the sentence and the composition in which it is found.
There are several theories of meaning that could be used directly in AI. For example, one philosophical theory states that meanings are purely mental contents provoked by signs; in other words, language is a way to access contents stored in the memory of the brain. This is referred to as the idea theory. Another theory, the pragmatist theory, says that the meaning (or understanding) of a sentence is determined by the consequences of its application; in other words, language is a way to invoke certain functions.
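As a loose illustration (entirely a toy construction of my own, not drawn from the philosophical literature), the two theories suggest two different computational shapes: the idea theory looks like retrieval of stored content, while the pragmatist theory looks like invoking a function whose consequences constitute the meaning.

```python
# Toy sketch: two "theories of meaning" as two computational patterns.
# All words, sentences and stored contents here are invented examples.

# Idea theory: a sign is a key into stored mental content.
mental_contents = {
    "dog": "a four-legged domesticated animal",
    "rain": "water falling from clouds",
}

def idea_meaning(word):
    """Meaning as retrieval of the stored content a sign provokes."""
    return mental_contents.get(word, "<no stored idea>")

# Pragmatist theory: a sentence's meaning is the consequence of applying it.
actions = {
    "open the door": lambda state: {**state, "door": "open"},
    "close the door": lambda state: {**state, "door": "closed"},
}

def pragmatist_meaning(sentence, state):
    """Meaning as the state change an utterance brings about."""
    act = actions.get(sentence)
    return act(state) if act else state

print(idea_meaning("dog"))
print(pragmatist_meaning("open the door", {"door": "closed"}))
```

Neither pattern is a full theory of understanding, of course, but each maps naturally onto a familiar NLP building block: lookup-based semantics versus intent-to-action dispatch.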
A thorough study of these philosophical theories of meaning is required to be able to leverage some of this work in AI especially in the case of NLP.
| FAST | RESTful |
| --- | --- |
| Resources and functions are two different ways to access the functionality of the server | Everything is a resource |
| Functional calls will not have side effects in the FAST architecture | Any call can have side effects |
| A function can be accessed through “parameters” | Access is through GET and POST to resources; URL parameters are in line with the RESTful philosophy |

Both architectures allow for stateless interaction between the client and the server, and both allow for interaction over HTTP.
| RPC | FAST |
| --- | --- |
| A session needs to be maintained between the client and the server | No session needs to be maintained |
| Calls can have side effects / changes in state | Only RESTful calls will have side effects; pure functional calls will not have any |
| The API needs to be known in advance | The API can be discovered during interactions |
| Strong coupling leads to a less scalable design | The absence of coupling can be used to build scalable systems |
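The side-effect distinction drawn above, where REST-style calls on data resources mutate state while functional calls stay pure, can be sketched with a toy in-memory server. The class and method names below are my own illustrative choices, not part of any FAST specification.

```python
# Toy in-memory sketch of the FAST idea: data lives in mutable resources,
# while functions are pure and never touch server state.

class FastServer:
    def __init__(self):
        self.resources = {"counter": 0}               # mutable state, REST-style
        self.functions = {"square": lambda x: x * x}  # pure, no side effects

    def put(self, name, value):
        """REST-style call: mutates server state."""
        self.resources[name] = value

    def get(self, name):
        """REST-style read of a resource."""
        return self.resources[name]

    def call(self, name, *args):
        """Functional call: the result depends only on the arguments."""
        return self.functions[name](*args)

server = FastServer()
server.put("counter", 41)        # side effect: resource state changes
print(server.get("counter"))
print(server.call("square", 7))  # pure: same arguments always give the same result
```

Because `call` can never change state, repeated functional calls are trivially safe to cache or retry, whereas `put`/`get` carry the usual REST semantics.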
The FAST architecture is superior to the RESTful architecture in several ways, as the comparison above shows.
Following up on the previous two posts, we propose a new architecture that combines the functional paradigm with RESTful programming. We name it the FAST architecture.
The role of each piece of the FAST server is described below:
The key ingredients of the lambda machine are:
This architecture allows the following calls to the FAST API:
Continuing on from the previous posts: the RESTful paradigm treats everything as a resource (both data and functions). Any method applied on a REST server may modify its state, so subsequent methods can yield different results. This is not compatible with the functional style. If resources are instead classified into data and functions, then one can implement mutable data alongside functional calls that have no side effects.
Parameters can be passed as part of the request or as data objects with URIs. This integrates functional programming with the RESTful architecture to some extent. There are two key innovations here:
Functional resources might remind one of RPC, but they are fundamentally different: we need not maintain a session between the client and the server, and all the interaction can happen over HTTP. This architecture is ideal where data and computation are equally important.
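The two parameter-passing styles can be sketched with an in-memory stand-in for the server. All URIs, names and the dispatch logic below are hypothetical illustrations of the idea, not a real wire protocol.

```python
# Sketch of the two parameter styles: inline values in the request,
# or references to data objects addressed by URI.

data_store = {"/data/vector1": [1, 2, 3]}   # data resources addressable by URI

functions = {"/fn/sum": sum}                # functional resources

def resolve(param):
    """A parameter may be a literal value or a URI naming a data resource."""
    if isinstance(param, str) and param.startswith("/data/"):
        return data_store[param]
    return param

def call(fn_uri, param):
    """Stateless call (standing in for an HTTP request): no session is kept."""
    return functions[fn_uri](resolve(param))

print(call("/fn/sum", [4, 5]))           # parameter passed inline
print(call("/fn/sum", "/data/vector1"))  # parameter passed as a URI
```

Passing large inputs by URI keeps requests small and lets the server fetch data locally, while inline parameters suit small, ad hoc arguments.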
This is the first of a series of blog articles on combining functional and RESTful paradigms.
RESTful programs are by definition resource oriented. A resource is an abstraction of a computational object. RESTful resources can represent a physical entity, an informational object, or even an abstract entity. The resource-oriented paradigm draws inspiration from the Web and hence carries with it a bias toward documents. For example, it is easy to make a call like “GET /mylocation” in a RESTful setup, but awkward to make a call like “forecast the weather at latitude 74.34564, longitude 34.0900 on 27th December 2016”. There are some RESTful ways of executing the latter query, but all of them are workarounds and do not adhere to the RESTful spirit.
As an improvement to RESTful services, I think network computation should be split into two parts:
This architecture allows for the segregation of data mutations through REST methods and immutable operations through function calls.
Each programming paradigm has its own use, although I’m not a big fan of object-oriented programming. Functional programming is highly useful when you want to achieve a high level of abstraction; this helps in segregating implementation from specification. But the problem with functional programming is inefficiency, which primarily arises from the lack of control over implementation. This can be surmounted over time as smarter interpreters are built. Imperative programming, on the other hand, gives that control to the programmer right away. It is also sometimes easier to build imperative programs, as programmers have long been used to coding in this paradigm.
In contrast to these two styles, OO does not offer much of an advantage. The only new thing it brings to the table is the ability to maintain state as part of objects, but that can be achieved through other means.
So my approach to writing analytics code is to use imperative programming while keeping functional paradigms in mind (abstraction, statelessness, use of first-class functions, etc.). This ensures the code is as clean as a functional program while retaining the efficiency of an imperative program. Python is an ideal language for achieving this.
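A small sketch of what this looks like in practice (the pipeline and its steps are invented for illustration): a plain Python driver loop composes pure, first-class functions, so the control flow stays explicit while the logic stays side-effect free.

```python
# Illustrative sketch: an imperative driver loop over pure, first-class
# functions. The record fields and steps are made-up examples.

def clean(record):
    """Pure step: returns a new dict, never mutates its input."""
    return {k: v.strip() if isinstance(v, str) else v for k, v in record.items()}

def enrich(record):
    """Pure step: derives a new field instead of updating state in place."""
    return {**record, "name_length": len(record.get("name", ""))}

def run_pipeline(records, steps):
    """Imperative driver: explicit loops give control and efficiency,
    while the steps remain swappable first-class values."""
    out = []
    for r in records:
        for step in steps:
            r = step(r)
        out.append(r)
    return out

data = [{"name": "  alice "}, {"name": "bob"}]
result = run_pipeline(data, [clean, enrich])
print(result)
```

Because every step is pure, the pipeline is easy to test step by step and to reorder or extend, yet the driver is as direct and efficient as any hand-written loop.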