GitHub Copilot

A new beginning of AI interaction with machines: pair programming and learning through machine-generated suggestions

Released a few weeks ago, this tool appears to have suddenly popped out of nowhere. GitHub, otherwise a simple and much relied-upon version control system in the cloud, now owned by Microsoft, has come up with an AI-based tool that does the trick of machine-based pair programmer "accompaniment". At first, the tool is being provided on a limited basis: sign up and you are placed on a waiting list, and it looks like many have already signed up during this beta testing phase. But what now? Did the coder die? Is this the end of computer programmers, as some conclude rather quickly? Will this be a new era of coders who have no background in coding at all? Sensible and nonsensical questions alike are being asked by the developer community and have been trending on the internet in recent days.

How this works is very simple

In your editor, you type a comment or a function signature describing the code you want to write, and Copilot suggests an implementation. "It is as simple as giving your idea to a pair programmer, who goes ahead and implements it," said Debashish Banergee of Paragon, a software company already using this tool productively. "Although it is beta code, we see benefits," said Deb while talking to this blog. This becomes important when security comes into play. If you are familiar with Tabnine, you should know how these tools work.
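To make the workflow concrete, here is a hypothetical illustration (Copilot's actual suggestions vary; the comment and method name below are mine): you write a descriptive comment and a signature, and the tool proposes a plausible body.

```java
import java.time.LocalDate;
import java.time.temporal.ChronoUnit;

public class CopilotDemo {
    // You type the comment and signature...
    // compute the number of days between two dates
    static long daysBetween(LocalDate start, LocalDate end) {
        // ...and the tool suggests a body along these lines.
        return ChronoUnit.DAYS.between(start, end);
    }

    public static void main(String[] args) {
        System.out.println(daysBetween(LocalDate.of(2021, 7, 1),
                                       LocalDate.of(2021, 7, 4))); // prints 3
    }
}
```

You accept, reject or edit the suggestion, exactly as you would with a human pair programmer's first draft.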

Where do they get the big data from ?

It is understood at this time that Copilot scrapes and inspects code from public repositories on GitHub. So, as opposed to a search on Stack Overflow, your best bet now may be a Copilot hit: ask Copilot and, as I see it, you get better results than a search would give you. Will this reduce the traffic on Stack Overflow? Or is there another strategy in the making?

Is this the end of the beautiful coder ?

Not really! Many times such scenarios have popped up within the IT sector. Earlier it was paper reduction, then cashless payments, and now AI-based systems such as Copilot give the impression that no individual is necessary. That is the ideal case; we will always need an individual to troubleshoot. Can anyone code now? Yes, sure.

What's in it for Stack Overflow?

Experienced coders will still take their searches to Stack Overflow, for sure. What Copilot gives you are suggestions, which an experienced coder may not need anyway. Do you agree or not?

AI Model and transformers utilized

The plugin for Visual Studio Code uses OpenAI Codex, a model descended from GPT-3, the third generation in the GPT series. Its ability to turn scraped internet text into human-readable output is impressive. With the above, things look pretty good in terms of expectations for Copilot. However, there is talk within some circles that Copilot also learns from private repos within GitHub. If that is the case, Copilot could deliver code for which permission to produce derivatives was never granted, and such code would start appearing in different applications, some say. Whatever the case may be, Copilot, or the "pair programmer" as they project it, is surely another step toward AI-based applications. Until robots actually arrive, and we see what form they take, programmers will keep rising.

Leave a comment. I get a fair amount of hits on my blogs, which have hundreds of different articles. Please feel free to read them. Subscribe if you would like to be updated with fresh articles periodically.

For inquiries, interviews, write-ups please email me

Posted in Big Data BigData Cloud | Leave a comment

History of Data Breaches Since 2005 – An Interesting Chart. Dust Rising / Scatter Plot! Take a look!

Interesting Chart showing data breaches, companies and total number of data records stolen.
Observe the total number of records against each company. The outliers are the financial organization, where money plays, and the TJX chain, where goods and money both play.
The chart is interactive. If you hover your mouse pointer over each point in the graph, it will display the total number of records. The X axis shows the company names. Source and courtesy of Privacy Rights Clearinghouse.

11,717,204,381 Records Breached Since 2005


Digital Sprinkling – An algorithm that lets computers decide on their own. A think-alike society, yet different.

Digital transformation is heavily taking shape. Large and small organizations are battling the outburst of data and trying to derive meaningful information from structured and unstructured data. Behind all this, publicly facing collaborative applications, mainly social media tools, are streaming heavy volumes of data. If enterprises miss having these streams of data sucked into their revenue-generating bloodstream, they may not be able to row across to the treasure island. On the other hand, social media cold wars are being fought. At times we see heavyweight integration, and at other times we see lightweight integration, such as the OAuth types. These integrations sometimes make us believe that no wars are being fought and that everybody is simply in it to make money. So be it. We are now going through a second-generation social media wave.

An independent study in coordination with Stencil Research highlights and reiterates that we will eventually be replicating content at mega volumes. So far it was just proliferation of content across the internet. But now a new wave is sweeping the internet: system-defined algorithms that replicate content by deciding which content will be good for you. This content will be delivered to you, and so will it be delivered to thousands of other people. All of a sudden you become part of a chain whose members all see the same content, and you will all start to think in a similar way. Your views on politics, people and places will all be the same. Pretty soon you will start living in a world where you were not there in the first place. Sounds very paranoid, right? Read on.

  1. Joe sends an email to Stacey; a link to a YouTube or Vimeo video is contained within that email
  2. Joe chats with Stacey and sends content, or again mentions some news article
  3. Joe shares Steve's content (say, a purchase that was made) with Stacey on social media, and so on; collaborations begin to happen and continue..
    Because Stacey has now visited the link sent by Joe, and has already seen the video, read the email or visited a URL, this information gets cached within Stacey's system. This content will now make Stacey's system suggest similar content. Note that since Joe has also seen the same content, the suggestions for Joe will remain the same. This algorithm already exists in e-commerce applications and is called up-selling.

    Digital Content Sprinkling – Infographics

  1. To drill down a little further, what is going to happen is that slowly the shared content and the related content, in the form of up-selling, suggestible content and persisted information about a video, URL, text or audio, will all start appearing in synchrony for Stacey, Jack, Steve and many others, in a spree of hundreds of replication routines. These content replication algorithms will utilize cookies and collaborations, link shares, video shares or simply content shares as mentioned above to show the same, or at least similar, content to users, and more. Thus, nearly the same content will be seen and read by a society of people. These societies will start to think and act the same. Enterprises and companies who make a sale will start to exploit these societies. All of a sudden, you will see sudden shifts in thinking within factions. These happenings will further digitally divide societies, because content replication will not be confined to product sellers. It will creep down to explicit content, to vested interests, to politics and so forth. Already we are seeing political interference from external nations. These content replication strategies may have already been utilized by syndicates.

    At this point, less than 20% of such content gets replicated, said a research scholar from Stencil Research, an IT research firm from the San Francisco Bay Area, California.
    This replication of content has nothing to do with artificial intelligence. It can be thought of as a secondary re-purposing of content; it just happens automatically. It is like a lightning strike followed by thunder: the thunder was never the purpose, it just happened by accident.

A little more

A. Search engines today make use of caches, or simply cookies, that sit within local machines; to be simple, your machines. With various compliance requirements such as CCPA and GDPR, it surely gets cached in your system. (More CCPA-like laws across nations will evolve.)
B. These search engines will also have stored cookies on my machine(s), and on 10,000x other machines too. Yes? OK.
C. These search engines will also have many other tools and services that keep track of what you do, whom you talk to, to whom you send emails, with whom you chat and what your location is. Even if your location is switched off, AI-based systems understand where you are from; simple IP tracking will give more details. Yes? Cool. 🙂
D. Now, once content starts getting shared via email, social media and chats, similar content starts appearing. Then a synchrony of these contents slowly starts to evolve, and thousands will start experiencing the impact of the content. Pretty soon you start thinking of buying a TV, and all of a sudden your friend starts thinking the same. You want to form a political view; your friend starts thinking the same. Imagine the impact: how people can change your perceptions. Psychologically, when a person buys something and tells another, there is a high possibility of the other person buying the same thing. Digital marketing utilizes this. Reviews on many portals help in this kind of digital marketing, or DM.
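A minimal sketch of the kind of replication step described in A through D (the names, data structures and method names here are mine, purely illustrative): a share marks the content as "seen" by both sender and receiver, and the recommender then surfaces everything seen elsewhere in the chain, so everyone converges on the same content.

```java
import java.util.*;

public class ContentSprinkle {
    // userId -> set of content ids the user has seen (the cookie/cache trail)
    static Map<String, Set<String>> seen = new HashMap<>();

    // Sharing a link marks it as seen by both sender and receiver.
    static void share(String from, String to, String content) {
        seen.computeIfAbsent(from, k -> new HashSet<>()).add(content);
        seen.computeIfAbsent(to, k -> new HashSet<>()).add(content);
    }

    // Suggest everything other users have seen that this user has not:
    // the replication step that makes whole chains converge on the same content.
    static Set<String> suggest(String user) {
        Set<String> out = new TreeSet<>();
        for (Map.Entry<String, Set<String>> e : seen.entrySet())
            if (!e.getKey().equals(user)) out.addAll(e.getValue());
        out.removeAll(seen.getOrDefault(user, Set.of()));
        return out;
    }

    public static void main(String[] args) {
        share("joe", "stacey", "video-1");   // Joe emails Stacey a video link
        share("stacey", "steve", "video-1"); // Stacey passes it on
        share("steve", "jack", "article-9"); // Steve shares something else
        System.out.println(suggest("joe"));  // prints [article-9]
    }
}
```

Note that nothing here is "intelligent": it is exactly the secondary re-purposing of content described above, happening automatically from the share trail alone.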

What should happen?

I am a small pebble on a mountain and I have no shining materials either. Therefore I say it loud! There are giants around who can shape the world as technology evolves into its most modern forms. Billionaires, tech giants and philanthropists have already talked about the perils of building artificial intelligence systems without ethics. Only governance and regulation can stop such cruelty in applications using AI. There should be philosophers who understand technology. A new breed of philosophical and technological saints, who understand art, literature and history combined with the social changes happening, should evolve. Such a man or woman should be sought out, appreciated and brought to the front. He or she should be called out by the world. Like Einstein, this person could be one among the billions the world creates.

As far as technology itself and education are concerned, technology is basically neutral. It's like a hammer. The hammer doesn't care whether you use it to build a house or whether a torturer uses it to crush somebody's skull; the hammer can do either. ~Noam Chomsky

“To change the world we need to combine ancient wisdom with new technology” ~Paulo Coelho

It's not a faith in technology. It's a faith in people. ~Steve Jobs

We are living in a separate world ~ Bob Marley


The Neo revolution of the Opensource movement

This article has appeared in

If you are following the #startup moves within Silicon Valley, you know all too well that #startups are racing, at lightning speed, toward releasing new applications under the term "MVP", or "Minimum Viable Product", a term that has existed for quite some time now. With a lack of available resources and churn rates at never-before-seen numbers, even MVPs are consuming more time, potentially leaving disrupted applications to die inside the trenches without ever seeing the outside world.

Testing is one of the key steps that cannot be evaded, no matter whether you have adopted TDD or are investing in complete automation scripts. To that extent, testing is a major component of the software life cycle. What was traditionally a simple automation script, or a simple recording and running of tests against your application, has now become much more complex. Testing has itself become an independently owned software development effort. So one cannot discard the importance of testing, whether you talk about the traditional waterfall model or the infamous and much-bloated two-week iterative #devops model. Whether an iterative two-week deployment model is a good approach or not is not within the scope of this article; the intensity and massive importance of testing is. As more and more globalization takes shape, the need to provide more time for testing grows, or in other words, as more and more layers are built over monolithic applications, there exists a stringent need for building methodical tests around them. Here is the point-blank statement: opensource frameworks and pre-built applications have already gone through such test harnesses. That brings with it ample time that can be saved. Do you concur?
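To make "tests as software in their own right" concrete, here is a minimal, framework-free sketch (plain Java asserts standing in for JUnit; the discount function and its rules are invented for illustration). The tests read as an executable specification of the behavior.

```java
public class PriceCalculatorTest {
    // The unit under test: apply a percentage discount to a price in cents.
    static long discounted(long cents, int percent) {
        if (percent < 0 || percent > 100) throw new IllegalArgumentException("bad percent");
        return cents - (cents * percent) / 100;
    }

    public static void main(String[] args) {
        // Each assertion documents one rule of the unit under test.
        assert discounted(1000, 10) == 900 : "10% off 10.00 should be 9.00";
        assert discounted(999, 0) == 999 : "0% is a no-op";
        assert discounted(500, 100) == 0 : "100% off is free";
        System.out.println("all tests passed");
    }
}
```

Run with `java -ea PriceCalculatorTest` so assertions are enabled; a real project would grow this into a JUnit suite wired into the build.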

Collaboration is very important; some say collective intelligence. This invaluable collective intelligence cannot be assimilated without its very own intelligent leadership, especially with diverse groups within each organizational unit and multi-cultural diversity existing across a globally distributed development network. If this doesn't exist for you, just tap into it. At best you will get close to 24 hours of development time per day: when one sleeps, the other is awake!

With data that has ballooned, algorithms and data-modeling possibilities that can make near-real-time predictions have also risen. Software engineering has never been so complex in nature, no matter what tools you use. Given this scenario, what if we use already-tested frameworks? Java is a very good example of an opensource development ecosystem. With C++ evolving, as stated by study groups and acknowledged by Stencil Research, existing opensource frameworks that have gone through mass testing via collaborative effort become a key factor in the success of startups. Available runtimes like the Apache web server play a leading role in this situation.

As with any law of nature, the big shark eats the small fish. An example of this is the takeover of the Java licensing model by Oracle. Will opensource die eventually? Will the valuable time spent by open-minded engineers and developers be futile, thrown down the drain? Will the big fish gulp down the small fish? One may argue that OpenJDK is as good as the licensed version and that the foundation of the underlying code is the same. Yes indeed. Nevertheless, as long as the brutal nature of monsters gulping down the little things around them exists, one can never say anything about the future. As time progresses, more and more enthusiasts and broad-minded engineers arrive at the border crossing, ready to enter the land of opensource. It is a good one-liner for the monsters to know: "many hands make light work for you!" It reminds me of an old kindergarten poem. Here it is…
If all the men were one man,
What a great man that would be!

Great leaders have always laid out a green path for the people to walk. Ronald Reagan said, "The greatest leader is not necessarily the one who does the greatest things. He is the one that gets the people to do the greatest things."

It's time to see the beauty of the world. Let the leaders make the people do the greatest things, not snatch credit and leave people in agony. For ages man has been trying to live together and coexist. Yet there are fights and menaces. Let the servant leader lead the path. Let the flowers bloom in all the trees.

I said to the almond tree, 'Friend, speak to me of God,' and the almond tree blossomed. ~Nikos Kazantzakis

Why should the committer to opensource, the one whom I call "the opensource man", be "committed"? Look at it from any angle: it is about broad-mindedness, about the development of the society he or she lives in, about the unity of us all, and yes, he or she should stay open in order to gain as well.

The aeroplane and the radio have brought us closer together.
The very nature of these inventions cries out for the goodness in men, cries out for universal brotherhood, for the unity of us all.
 ~Charlie Chaplin

Thank you,
With warm regards,


Disruptions, accusations, winning the game and innovation- What a small story!

Disruptions, Accusations, Winning games and Innovation. Board the next train!

Click the image to read more.



Data Visualization in its art form. Know the future that only God knew.

Listen to the Abstract of the session or read the text below.

With the exponential growth of data, a new kind of problem is also on the rise: the existence of large volumes of useless data residing within useful data. Through a filtration system, meaningful data needs to be suctioned out before any analysis. An independent study shows that there may be core information residing within voluminous data that can even change the course of life. For instance, health technology was the forerunner of innovation in the past. While that is no longer the case, the data stored within those mega systems contains core information that can predict outcomes, if suctioned out or filtered in its purity. The sad part of the story is that such core information is hidden within impure, unwanted or useless data; therefore analysis becomes difficult.
The second part of the story is that when this data is filtered out, it will be left as a huge ocean with no regularities. Here comes visualization in its art form.
This 45-minute session makes an attempt to discuss this by throwing a base idea on the wall and drilling down further.
Imagination is more important than knowledge!
Register by tweeting me @sunnymenon. Only 25 seats available. The outcome of this is a design, a move toward an opensource framework, and thereafter diffusing the innovations across the technical, data science and visualization communities.


Protect your code, no matter opensource or not; DevOps, continuous integration and what not.

If you are battling to keep up with trends in the software industry by continuously releasing enterprise products, adding more functionality and improving user experiences, then someone needs to focus on who is protecting the code. Somebody needs to look into the process and methodology of how this code quickly gets built and rolled out while the code still stays protected.
Welcome to the world of coding, coders and a whole million lines of code.

"Oh, I'm not a monster. I'm just ahead of the curve." ~ the Joker, in Batman. 🙂

What was usually done by an operations team within large enterprises is where, today, a term called "DevOps" has set its large foot. Code is not only protected, but deployments happen continuously. What is being seen, though, is that there are no huge benefits to a two-week rollout. According to an independent study, if these two-week rollouts are not carefully planned, you are bound by constraints that will extend the life cycle of the project in general: the end results that you would have attained within, say, X amount of time will cost you X+N. So check the plans. A small tab sheet for the DevOps team is helpful anyway.

Are you having that tab?

When Georgi from a startup company came to me asking for help in safeguarding a huge code base, I didn't realize that the damage had already been done. There were things these guys had to do, and do real quick, before it all went out of hand into a solid spin. You see, Georgi was smart enough to approach me with his little doubts about issues he thought might pop up. It was a wise move. So here are the use case and the solution provided. The total cost involved, number of people, timelines and the methodology adopted for implementation and delivery are highlighted. Read on.


  1. 16 developers, with 7 additional developers remote.
  2. 5 testers, one onsite and the rest remote.
  3. Four different components: the front-end UI, event brokers/messaging layer as a transport between the front end and the back-end platform, the database access layer, and finally the platform itself. 32 integration checkpoints, including connectors to search repos, analytics repos and posting data to 30 other internal and external systems.
  4. Two- to three-week delivery cycles. Technically, something was rolled out every two weeks, as per Georgi, who was managing all these activities and was the direct report.
  5. Backup servers not on a schedule, but on demand.
  6. SVN and Git local repos. However, code was not kept in sync, as multiple developers were checking out code from the same branch and the same class files.
  7. The development environment used Java, JavaScript, Node.js, RabbitMQ, MySQL, Solr, Lucene, ZooKeeper, Spark, Spring Boot, Hibernate, Redis, nginx and Tomcat. Containerization was done using CoreOS and Docker; eventually they were planning to consolidate on one. I am evaluating CoreOS and Docker for them to see which suits their environment.
  8. Currently, performance is assessed through custom-built performance tools, and network access and speeds are tested through common tools with simulated users originating from outside the firewalls.
  9. The build tool is Maven combined with Gradle, Chef recipes, Bamboo and Jenkins, interacting through hooks to Git. There is also a bug-reporting tool for which integration has been requested. SVN stands alone, with all JavaScript posted there; the reason: they began version control and management with SVN.
  10. Cloud enabled.
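The Maven-plus-Jenkins side of item 9 might be wired up with a declarative pipeline along these lines. This is a sketch with assumed stage names and settings, not Georgi's actual configuration:

```groovy
// Jenkinsfile (declarative) - triggered by a Git hook on push
pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps { checkout scm }              // pulls the branch that fired the hook
        }
        stage('Build & Unit Test') {
            steps { sh 'mvn -B clean verify' }  // fail fast before anything ships
        }
        stage('Package') {
            steps { sh 'mvn -B package -DskipTests' }
        }
    }
    post {
        always { junit '**/target/surefire-reports/*.xml' } // publish test results
    }
}
```

Keeping the pipeline itself in the repo means the build process is versioned and protected alongside the code it builds.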

Solution Visual


Dockerization – Scalable architecture for high volume web application AND an open statement to the enterprises.

With containerization gaining momentum and Kubernetes promising better deployment models, a question arises first: what kind of model should one support when it comes to large-scale streaming applications?
High-volume use cases are understood by all; many have talked about them, and others have been able to deploy high-volume, hugely scalable architecture models.
Dockerization, or containerization, is much needed within the healthcare and financial domains for sure, not to mention manufacturing and retail. However, feeble and shabby manual installations prevail within such organizations.
It is sometimes sad, pitiable and annoying to see such haphazard and immature deployments. People who have deployed such models would have been better off without them, investing instead in learning about containerization. Nothing but words and BS flow from supposed architecture discussions, while consultants, engineers and vendor companies struggle through, spending half their time battling compliance and policy issues and, above all, jurisdiction problems such as who will do what and so forth.

Reminds me of a quote :-
“People who think they know everything are a great annoyance to those of us who do” ~ Isaac Asimov.

It's OK not to know; but it is completely stupid to oppose a budding model and kill it instantly.
As technology shifts and evolves into something beautiful, it is important for people to embrace aggressive technical discussions and set aside the emotional disturbances that come with personal agendas and issues. Let there be emotional intelligence rather than personal emotional turbulence. Not many come to the enterprise to marry, and hey, nobody is inside to fight with people either.
Writing it off by saying such stupidity will always exist in this world has brought us to a point where one really needs to "think differently". Isn't it time to "think differently"?

A search on the internet for scalable models or large-scale deployments gives architecture descriptions, and YouTube-like places spend time on the details of containerization internals and so forth. But not much exists in the form of diagrams or architecture models that can simply be adopted and used. Individuals with eagerness and enthusiasm have been shot down during these discussions and have lost their flames. Let me stop before the subject diverts.

Below see if the model makes sense. Please feel free to give your opinions. If you have questions, do write it here. Collaborate for the sake of knowledge THEN COLLABORATE FOR THE SAKE OF MONEY.

<Please leave a comment or question. Feel free to give your opinion.>

******Architecture diagram below*****CLICK TO VIEW AS LARGE IMAGE*******


A few points to note.
1. This is a single unit within a cluster. (I don't want to call it a worker, because there are multiple workers within multiple containers here.)
2. The number of boxes does NOT represent the number of components such as HAProxy or nginx. Containerization here takes Docker into perspective; changing the box to CoreOS may not be as efficient a model.
3. The number of nodes is provided as an approximation (125 nodes).
4. Within a Swarm environment, nodes can be reduced. This model depicts a single-unit cluster. (A worker in generic terms; however, I do not want to call it a WORKER node.)
5. This is to be viewed from an application standpoint, not from an infrastructure standpoint.
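As a starting point for discussion, one way to express a single unit of such a cluster is a compose file: a load-balancing edge, replicated web and app workers, and a data tier. The service names, images and replica counts below are illustrative assumptions, not a prescription:

```yaml
# docker-compose.yml - one "unit" of the cluster, application view only
version: "3.8"
services:
  edge:
    image: haproxy:2.4          # load balancer in front of the web tier
    ports: ["80:80"]
  web:
    image: nginx:1.21           # static content / reverse proxy
    deploy:
      replicas: 3               # multiple workers inside one unit (point 1)
  app:
    image: myorg/app:latest     # hypothetical application image
    deploy:
      replicas: 6
    environment:
      - DB_HOST=db
  db:
    image: mysql:8.0
    environment:
      - MYSQL_ROOT_PASSWORD=change-me
```

Note that `deploy.replicas` takes effect under Swarm (`docker stack deploy`), which matches point 4 above; replica counts, like node counts, are approximations to tune per workload.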

Leave your comments, whatever they may be. It is from many mistakes that one learns a lot. If you ask, I will reply.

Thanks again.


Eight things to consider in the design of an Apache Spark/Hadoop ecosystem.

There are eight things to consider while designing an Apache Spark-enabled application. Porting to a Spark/Hadoop ecosystem is an important step, dictated by the need for streaming capabilities and extreme speed of execution. Apache Spark uses clustering algorithms and can be used with HDFS, making it a composite architecture. Unless you understand the business process and the incoming data, it would be inefficient to build such an architecture. Remember, from big data volumes comes value, NOT traditional reports.

1. Spark relies on in-memory execution of tasks and in-memory storage. Because of this, it is important that you design your system, and build your processes, with this in mind.

2. These days, writing in Java can be more efficient from a resource standpoint, and Java does its own concurrency well. Just because several APIs are built on Scala does not necessarily mean your execution speeds up. It is therefore worthwhile to consider writing in Java.

3. Since Spark architecture, whether in the cloud or standalone, uses in-memory space for data and executors, think about the heap size. Continuously increasing heap sizes just to get a job to execute may reduce efficiency.

4. Using User Memory is not recommended unless your architecture really demands it for some core, extremely high-speed streaming need, such as fraud detection where a huge segment is to be detected, OR failure handling of a system within your APPLICATION cluster.

5. Take advantage of Unified Memory Management (Spark 1.6.x and above needed). This type of management uses memory more dynamically: the executor and data can push their limits when needed, rather than failing.

6. Consider nodes as individual machines. This will help your infrastructure planning, because every Spark executor in an application has the same fixed number of cores and the same fixed heap size.

7. Before using Mesos, consider using Hadoop/YARN.

8. Architecture is an art. So imagine, understand, absorb, design, travel through the design, re-design and architect, test small, test big, implement by deploying in the cloud (perhaps this is the ideal case) and go live.
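Points 1, 3, 5 and 6 above usually come down to a handful of settings. Here is a starting-point fragment of `spark-defaults.conf`; the values are placeholders to tune against your workload, not recommendations:

```properties
# spark-defaults.conf (fragment)
# Executor heap: resist the urge to keep bumping this (point 3)
spark.executor.memory        8g
# Same fixed core count for every executor in the application (point 6)
spark.executor.cores         4
# Fraction of heap for the unified execution + storage region (point 5, Spark 1.6+)
spark.memory.fraction        0.6
# Portion of that region protected for cached data
spark.memory.storageFraction 0.5
```

With Unified Memory Management, execution and storage borrow from each other inside that single region, which is exactly the "push the limits rather than fail" behavior point 5 describes.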

Meet me at #DreamForce #df16. Know how it would benefit you and how to fix the meeting at .

Thank you.


New Age Modernization, Dev Ops and You.

DevOps initiatives have already reached a turning point within enterprises. Many have set foot in deep waters and don't know it yet. This is mainly because of a lack of involvement of third-party domain experts or external thinkers who can bring fresh thinking into the enterprise. What should have been a change or shift in existing processes to become more agile has ended up as nothing but deployment automation. The way applications are deployed to multiple systems is changing, no doubt, but that alone will not satisfy the need to be agile and lean. Visibility across the enterprise is a big problem. Those who can scale well will be the winners of the future; they will be successful, but to scale you need visibility. One must know that the issue is not a lack of technology or a lack of processes, but the difficulty of scaling well. This can only be attained through one type of personality: people who are well versed in technology and equally good at understanding the business model and processes within companies; somebody who has vision for that particular domain. For example, a company selling candy must think about which kids' movies are being released from time to time. For the aforementioned to happen, companies must implement systems or applications, and such applications must foresee the multiple devices coming into the market. Such is the new global modern marketplace.

Another key factor to look into is the rise of the cloud. It is predicted that in the coming years, more and more enterprises will move to the cloud. Cloud security is being redefined, and this redefinition will open tremendous possibilities for small businesses to move to a cloud environment and do "cloud business". Given the above, it is very clear that the nature of doing business, and therefore IT, is evolving into more complex work than developing a few lines of code. Integration, new technological breakthroughs, understanding multiple types of businesses and, most importantly, being able to scale all matter. Within this context, there are a few things companies should do to prepare themselves. One heavily discussed topic is how to become agile; for agility, DevOps is being discussed, and so are continuous integration and continuous deployment, and "server deploy", which pretty much alludes to nothing but deployment to production. But this alone does not trigger change, because change is imperative for all of the above to happen, and it should start somewhere. Here are the top things you could think about to ignite the change.

Initiate the change through systems of engagement. Silicon Valley-based author, advisor and speaker Geoffrey Moore wrote about what he defines as "systems of engagement". This touches upon modernization and IT. You can read about it in the wiki, and you can also find videos and presentations on this subject by Geoffrey Moore.
Today, processes are turning into microservices. Traditional web services still exist, but a transformation is going on. This transformation is pushed bottom-up more than anything else: developers push for this type of change more than the business demands it, and therefore it may often be out of place. While microservices have advantages in themselves, rolling them out sporadically over already-existing applications may cause investment drain and operational challenges.
Bring in new tools and services. The network has become part of the DNA of applications: when you have an application, accessing it depends heavily on network participation and availability. So bring in tools that help manage several of these components. For example, bring in sophisticated API management tools that differ a little from the already-existing ones; check for network awareness, and see how well they can detect nodes and automatically discover APIs and the metadata attached to them. Visualization is another key factor to look for.
Identifying good resources has always been a challenge within software engineering. Who knows the code best is difficult to know at times, and a good coder may not be the appropriate coder for a certain kind of work. Bring in leaders and mentors who can do this task, and inculcate positive feelings within teams. This is important when you are about to make a shift. Remember, a change may be difficult, but the end result of change will be fruitful.
Hold tech-talk-like sessions. When doing so, bring more people and allow them to talk. Make it interactive; making it interactive requires the organizer to be a good leader. Develop leadership qualities. Follow LinkedIn leaders, influencers and bloggers.
Make meetings a necessity for team members, and make them interesting. A leader must be knowledgeable, and must today be a leader of leaders, in a world where social media pumps thousands of things at the team members. What you are about to execute may be something they have already seen, and your initiative may be a thing of the past. For this not to happen: collaborate, discuss, share knowledge. Bring in new vendors and new people, and brush shoulders with giants.
As mentioned above, make a LinkedIn or Twitter account. Add leaders to your connections, or follow them to get their updates. One way to collaborate is by making your presence felt in social media, because collaborations ignite innovation.
Conduct health checks or assessment services. What is the best way to know how good your health is? Simply by a health check; you learn a lot about your hidden treasures and how to utilize them. Constant checking is important. Bring in a third party and get a second opinion, an unbiased view. If you want to do an ONLINE assessment to know where you are, please click here.
Did you know there is a new kind of decision maker? These decision makers don't decide based on facts alone; they make decisions based on the data they see and the future state that the data predicts. These kinds of decision makers are in the making, and you should know this as we move up the stack.
A fish inside a fish tank only knows the nicely generated oxygen and the abundant food given to it. What lies in the ocean, its vastness, freedom and the massiveness of its power to splash the water, the fish does not know. It would only get to know by inviting those who have seen the ocean and the blue sky, and hearing them talk about it. What I have seen farther and farther has been by standing on the shoulders of giants. Remember, it is not the end of the world.
For a big data assessment on intuitive visualization, please connect with me. Using various big data visualization tools, applications can be developed that address the data-based, futuristic decision makers. Connect with me.
