There is big hype around ICOs. Although there are several detractors, there is some merit in ICOs. Trouble comes when companies do an ICO without any need for a separate token, just to get rich quick! The same holds for blockchain, where companies use blockchain to create hype but do not fully implement its core concept: distributed decentralization.
To my mind, the real use case for blockchains and ICOs is in community-driven projects where there is no for-profit corporation involved (and even if one is involved, it will exit at some point). The ideal case in point is Ethereum. It is a public blockchain without any company trying to generate revenues and profits; it is managed and promoted by the Ethereum Foundation. Of course, the founders deservedly made enough money through Ether coins.
In other words, what open source was to proprietary software, blockchain should be to the cloud, and crypto-tokens to the equity shares of companies.
AI, Analytics and ML related articles have been moved to our company’s site: g-square.in/blog
This blog now will focus on distributed and decentralized computing.
Thanks for all the support!!
Decisions in organizations are made at the strategic, tactical and operational levels. Analytics for generating insights to support decisions also needs to happen at these three levels. The difference between the three lies in the time-frame and scope of the decisions’ impact.
For example, senior management deciding on product features is a strategic decision. A branch manager calling up a salesperson to push a particular product could be tactical. An e-commerce site recommending a product to a live customer is an operational decision.
We believe a different analytics approach needs to be followed for each of the three levels of insighting. Strategic analytics can be done by analysts using some level of automation. Tactical analytics needs to be driven by robo analytics, where the tactical team is given insights on a daily basis through automated robo-analytics and machine learning. Operational analytics, on the other hand, should be completely automated, where both the analytics process and the decision making are done by the machine.
Our company’s tools right now fall in the second category where we enable tactical decision making and insighting through robo analytics.
This cartoon was drawn by Jon Carter.
Data scientists and analytics professionals have been the new supermen. But the party may not last long, as automation tools will soon replace manual data analysis. The cartoon was inspired by a recent post: Six Very Clear Signs That Your Job Is Due To Be Automated. The article sets out six criteria to judge whether a job can be replaced by a robot.
- It Involves Little Physical Contact Or Manipulation Of Things,
- It Involves Answering Data-Dependent Questions,
- It Involves Quantitative Analysis,
- It Involves The Creation Of Data-Based Narratives,
- Consistent Performance Is Critical To Your Role Or
- There Are Well-Defined Rules For Performing The Work
If you examine them closely, all of these criteria apply to a data scientist’s job (especially points 2, 3 and 4). So..
Analysts, move on, the robos are coming..
Check out our robo analytics smart insighting tool at: Narrator
Democratising data is the buzzword in analytics right now. I want to extend the concept and say that analytics has to be democratised as well. Right now, business teams rely primarily on an analytics or BI team for all their analytical requirements. This is somewhat outdated, as modern technologies help automate a range of analytics processes and let end users directly access analytics (apart from data and BI). Organisations are not yet moving towards such an approach because of legacy issues and a lack of alignment of interests.
Imagine a world where, apart from data, the ground force has direct access to analytics (descriptive, predictive and prescriptive) on a real-time basis. Actionable insights are delivered directly to salesforce/operations teams for taking timely tactical decisions. Such a scenario is vastly superior to what happens right now in organisations, where a central team runs analyses once in a while and identifies actionable outcomes based on its own judgement. Democratising analytics not only improves the efficiency of analytics but also significantly reduces costs.
This model of self-service analytics can make analytics available throughout the organisation. But to implement such a solution, there needs to be a change in the mindset of organisations, especially in the central BIU/analytics teams: these teams should become enablers of analytical change rather than controllers.
There are several products in the market for implementing self-service tools. Gartner (an IT research and advisory firm) suggests that this could be the future of analytics: “This multiyear shift of focus from IT-led reporting to business-led self-service analytics passed the tipping point in 2016, which has forced a new perspective on the definition of a BI and analytics platform, and consequently, has significantly re-ordered the vendor landscape as represented in the Magic Quadrant.”
G-Square offers several self-service analytics products, most notably Narrator (Smart BI) and Clientrator (a customer analytics tool).
In continuation of this series on AI and philosophy, here I want to talk about the nature of meaning. As humans, we easily interpret language and extract meaning from sentences. Transferring this skill to machines has been a hot topic recently. Most attempts at doing so work only superficially: for example, some focus only on grammar, others on the sentiments carried in the sentences. To really make AI understand language, we have to look at what “understanding” actually means. Fortunately, there is already a long philosophical background on this.
Both the Indian and Greek philosophical traditions have a long history of speculation on the nature of language and meaning. In the Indian philosophical tradition, there have been two streams of thought. One stream claims that words independently carry meaning (much like in modern semantic theory). The other school holds that words do not carry meaning until they are placed in a sentence; that is, the meaning of a word is driven by the overall context of the sentence and composition in which it is found.
There are several theories of meaning that could be directly used in AI. For example, one philosophical theory states that meanings are purely mental contents provoked by signs; in other words, language is a way to access contents stored in the memory of the brain. This is referred to as the idea theory. Another theory, the pragmatist theory, says that the meaning (or understanding) of a sentence is determined by the consequences of its application; in other words, language is a way to invoke certain functions.
A thorough study of these philosophical theories of meaning is required to be able to leverage some of this work in AI especially in the case of NLP.
FAST vs. REST

| FAST | REST |
| --- | --- |
| Resources and functions are two different ways to access the functionality of the server | Everything is a resource |
| Functional calls will not have side effects | Any call can have side effects |
| A function can be accessed through “parameters” | Access is through GET and POST on resources; URL parameters are in line with the RESTful philosophy |

Both architectures allow for stateless interaction between client and server, and for interaction over HTTP.
FAST vs. SOAP/RPC

| SOAP/RPC | FAST |
| --- | --- |
| A session needs to be maintained between client and server | No session needs to be maintained |
| Calls can have side effects / changes in state | Only RESTful calls will have side effects; pure functional calls will not have any |
| The API needs to be known in advance | The API can be discovered during interactions |
| Strong coupling leads to a less scalable design | The absence of coupling can be used to build scalable systems |
Advantages of FAST Architecture
FAST architecture is superior to RESTful architecture in several ways:
- Clean design: Segregation of resource management and functional computation allows for modular development and ease in testing.
- Efficiency: Parallelization can be done automatically by the server, thus taking the burden of efficiency off the client.
- Security: Both resources and functions can have authentication and rights.
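To make this segregation concrete, here is a minimal in-memory model of a FAST server in Python. All names (`FastServer`, `lambda_machine`, the resource URIs) are illustrative assumptions, since FAST is an architecture proposed in this series rather than a published specification:

```python
# Minimal in-memory sketch of a FAST server: mutable state lives on the
# REST side, while the Lambda Machine holds only pure functions.
# All names here are illustrative, not part of any specification.

class FastServer:
    def __init__(self):
        self.resources = {}       # REST side: mutable resource store
        self.lambda_machine = {}  # Lambda Machine: pure, side-effect-free functions

    # REST methods: these mutate server state
    def put(self, uri, value):
        self.resources[uri] = value

    def get(self, uri):
        return self.resources[uri]

    def delete(self, uri):
        del self.resources[uri]

    # Lambda Machine: registering and applying functions never touches resources
    def register(self, name, fn):
        self.lambda_machine[name] = fn

    def apply(self, name, *params):
        return self.lambda_machine[name](*params)


server = FastServer()
server.put("/portfolio/alice", [100, 250, 40])
server.register("total", lambda xs: sum(xs))

# Pure call on parameters: no state changes
assert server.apply("total", [1, 2, 3]) == 6
# Function applied to a resource fetched from the REST side
assert server.apply("total", server.get("/portfolio/alice")) == 390
```

The point of the sketch is the clean split: testing the Lambda Machine needs no server state at all, which is exactly the modularity claim above.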
Following up on the previous two posts, we propose a new architecture for combining the functional paradigm with RESTful programming. We name it the FAST architecture.
The role of each piece of the FAST server is described below:
- The REST API provides a mechanism to post, update, delete and get resources.
- Some of these resources could be generated dynamically, in which case the REST API might interact with the Lambda Machine internally.
- The Lambda Machine exposes certain functions to the client.
- The client can request resources or function calls. Function calls are routed to the Lambda Machine; resource requests are routed to the REST API.
The key ingredients of the Lambda Machine are:
- Any calls to the Lambda Machine will not have any side effects
- The state inside the Lambda Machine is immutable. Any mutable state is stored in the REST server
- Users can directly call the Lambda Machine
This architecture allows following calls to the FAST API:
- Regular REST methods on resources: PUT, GET, POST, DELETE
- Apply a function on a set of parameters and get the results
- Apply a function on a resource
- Apply a function on a set of parameters and post it to a resource
- Apply a function on the results of another function
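These five call shapes can be sketched in plain Python over a hypothetical in-memory store; the dictionaries and URL-like keys below are assumptions for illustration:

```python
# The five call shapes a FAST API allows, modelled as plain Python.
store = {}                                   # REST resources (mutable)
functions = {"double": lambda x: 2 * x,      # Lambda Machine (pure)
             "inc": lambda x: x + 1}

# 1. Regular REST methods on resources
store["/nums/a"] = 10                        # PUT /nums/a
value = store["/nums/a"]                     # GET /nums/a
assert value == 10

# 2. Apply a function on a set of parameters and get the result
assert functions["double"](21) == 42

# 3. Apply a function on a resource
assert functions["double"](store["/nums/a"]) == 20

# 4. Apply a function on parameters and POST the result to a resource
store["/nums/b"] = functions["inc"](5)
assert store["/nums/b"] == 6

# 5. Apply a function on the results of another function (composition)
assert functions["inc"](functions["double"](3)) == 7
```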
Continuing from the previous posts: the RESTful paradigm treats everything as a resource (both data and functions). Any method applied on a REST server can modify its state, and subsequent methods can yield different results. This is not compatible with the functional style. If resources are classified into data and functions, then one can implement mutable data alongside functional calls without any side effects.
Parameters can be passed as part of a request or as data objects with URIs. This integrates functional programming with the RESTful architecture to some extent. There are two key innovations here:
- Segregation of resources into data and functions. This segregation helps in identifying which calls will have side effects and which will not.
- Parameter passing through HTTP. This helps simplify calls where a computation needs to be done on some variables.
Functional resources might remind one of RPC, but they are fundamentally different: we need not maintain a session between the client and the server, and all interaction can happen over HTTP. This architecture is ideal where data and computation are both equally important.
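One hypothetical way such parameter passing over HTTP could look: encode the function name in a URL path and its arguments as URL parameters, then dispatch to a pure function. The URL scheme, path convention and function names below are assumptions for illustration, not a defined protocol:

```python
# Dispatching a pure function from a URL with parameters.
# The /fn/<name>?arg=value scheme is a made-up convention.
from urllib.parse import urlparse, parse_qs

PURE_FUNCTIONS = {"add": lambda a, b: a + b,
                  "scale": lambda x, k: x * k}

def handle(url):
    parsed = urlparse(url)
    name = parsed.path.rsplit("/", 1)[-1]   # e.g. "/fn/add" -> "add"
    # parse_qs returns lists of strings; take the first value of each
    params = {k: float(v[0]) for k, v in parse_qs(parsed.query).items()}
    return PURE_FUNCTIONS[name](**params)

assert handle("/fn/add?a=2&b=3") == 5.0
assert handle("/fn/scale?x=4&k=2.5") == 10.0
```

Because the handler calls only pure functions, repeating the same URL always yields the same result, which is the side-effect-free behaviour claimed above.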
This is the first of a series of blog articles on combining functional and RESTful paradigms.
RESTful programs are by definition resource oriented. A resource is an abstraction of a computational object. RESTful resources can represent a physical entity, an informational object, or even an abstract entity. The resource-oriented paradigm draws inspiration from the Web and hence carries with it a bias towards documents. For example, in a RESTful setup it is easy to make a call like “GET /mylocation”, but awkward to ask for the weather forecast at latitude 74.34564, longitude 34.0900 on 27th December 2016. There are RESTful ways of executing such a query, but all of them are workarounds and do not adhere to the RESTful spirit.
As an improvement to RESTful services, I think network computation should be split into two parts:
- Pure REST functionality, where data is handled through RESTful services
- Functional APIs, where computation on some parameters is done using functional programming paradigms
This architecture allows for the segregation of data mutations through REST methods and immutable operations through function calls.
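To make the split concrete, here is a toy Python contrast between the two parts: a RESTful document fetch versus a functional computation on parameters. The location store and the `forecast` function are invented stand-ins, not a real weather API:

```python
# REST side vs. functional side, per the split above.
locations = {"/mylocation": (74.34564, 34.0900)}   # resource store (document-like)

def forecast(lat, lon, date):
    # Pure stand-in for a forecasting computation: no state is read or written
    return {"lat": lat, "lon": lon, "date": date, "temp_c": 21.0}

# REST style: fetch a document by its URI
lat, lon = locations["/mylocation"]
assert (lat, lon) == (74.34564, 34.0900)

# Functional style: compute on explicit parameters
result = forecast(lat, lon, "2016-12-27")
assert result["temp_c"] == 21.0
```

The forecast query that is awkward to phrase as a resource becomes a natural function call once the two concerns are separated.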