Big Data’s Big Deal is HUMAN TOUCH, not Technology



I have been involved in marketing analytics work for some years now. It requires me to regularly talk to CXOs about their big data challenges and their plans to leverage this data to improve business decision making. I am constantly surprised by how many misconceptions exist among executives. All of them read about new technologies and platforms coming out of Silicon Valley that magically clean, organize, analyze and visualize data for them. As if they just have to implement some technology, press a button, and insights will start flowing.

This is a myth. There is no such (magical) technology-based analysis. Period.

Big Data’s big deal is not about technology platforms – it is rather about appropriate human interface with data technology.

I am myself guilty of selling big data solutions under the facade of technology and platforms. In many ways, I have contributed to this misconception about Big Data technology. So, I hope you believe me when I tell you: Big Data’s big deal is not about technology platforms; it is about the appropriate human interface with data technology. Let’s stop speculating that technology platforms will save the enterprise from all its data problems. I have seen the most advanced technology platforms that exist today, and there is one thing I know: these platforms serve no purpose unless we have trained data professionals who bring three basic things: business/domain knowledge, analytical experience, and the ability to embrace new data technology.


We all know there is data everywhere. In the past couple of years, the world has generated more data than all of prior civilization put together. Whether it is content posted on the web and social media, data transmitted from sensors in cars, appliances, buildings and airplanes, or streams to your mobile, television or computer, we are surrounded and overwhelmed by data. Advances in technology are the main driver of this data deluge, and parallel advances in the technology to collect and store data have made it economical for organizations to build infrastructure to store and manage large data sets. But the real problem is deriving value out of this data and making it useful, and this is where most of the stagnation is today. According to International Data Corporation (IDC), only one percent of the digital data generated is currently being analyzed.


Everyone agrees there is a big data revolution happening, but it is not about the volume and scale of data being generated. The revolution is about the ability to actually do something with that data. What used to take millions of dollars (first to build the infrastructure, then to hire really smart and expensive individuals to analyze the data) can now be done for thousands. It all comes down to using the right set of new-age technologies and implementing the right set of rules (read: algorithms) to deliver answers that weren’t possible earlier. This is where new-age data computation and analysis shines. We have come a long way in leveraging machine learning, graph analysis, predictive modeling algorithms and other techniques to uncover patterns and correlations that may not be readily apparent but may turn out to be highly beneficial for business decision making.

There have been vast improvements in how and what type of datasets can be linked together to capture insights that aren’t possible with singular datasets. An example that everyone understands is how Amazon links together shopping and purchase history of customers to make product recommendations. Along with linking of datasets, improvements in visualization tools have made it much easier for humans to analyze data and see patterns. These technologies are now making inroads into all types of disparate use cases to solve complex problems ranging from pharmaceutical drug discovery to providing terrorism alerts.


Insights can only be delivered by data scientists, and there is a huge shortage of people who are comfortable handling large amounts of data. Data collection is easy and cheap, and the general approach is to collect everything and worry later about relevancy and finding patterns. This can be a mistake, especially with large datasets, because the sheer number of possible correlations increases the number of false positives that surface. No matter how sophisticated technologies get, we need more data scientists outside of academia, working full-time on solving real-world problems.
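The false-positive risk is easy to demonstrate with a toy simulation (not from the article, numbers invented): generate pure noise, then count how many variable pairs look "significantly" correlated as the dataset gets wider.

```python
import numpy as np

def count_spurious_correlations(n_vars, n_samples=50, threshold=0.28, seed=0):
    """Count variable pairs in pure noise whose sample correlation exceeds
    `threshold` (roughly the p < 0.05 cutoff for 50 samples)."""
    rng = np.random.default_rng(seed)
    data = rng.standard_normal((n_samples, n_vars))  # no real relationships at all
    corr = np.corrcoef(data, rowvar=False)
    upper = np.triu_indices(n_vars, k=1)             # each pair counted once
    return int(np.sum(np.abs(corr[upper]) > threshold))

few = count_spurious_correlations(n_vars=10)
many = count_spurious_correlations(n_vars=100)
print(few, many)  # the wider dataset surfaces far more false positives
```

The data contains no genuine relationships, yet the 100-variable version yields far more "significant" pairs than the 10-variable one, which is exactly why collect-everything-first needs skeptical human analysts.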

Machines cannot replace human beings when it comes to asking the right questions, deciding what to analyze, choosing what data to use, identifying patterns, and interpreting results for the business. Machines are good at fast computation and analysis, but we need data scientists to build hypotheses, design tests, and use the data to confirm or reject them. Traditional data scientists are not the whole solution, though. Many generalists in the data science field claim that if you throw data at them, they can deliver insights. But here’s the reality: someone who doesn’t have knowledge of your business can have only limited (if any) impact. In addition, data scientists need to make sure decision makers are not presented with too much data, because it quickly becomes useless. This is where technology and analytical experience come in handy: techniques that aggregate, organize, filter and cluster data are extremely important for reducing datasets to digestible chunks.
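As a minimal illustration of that last point (a hypothetical example; the customer data and segment scheme are invented), here is how a simple aggregation step can reduce thousands of raw records to a handful of digestible segment summaries:

```python
import numpy as np

def summarize_by_segment(spend, visits, n_bins=3):
    """Reduce raw customer records to a few digestible segment summaries:
    bucket customers by spend tercile, then aggregate each bucket."""
    edges = np.quantile(spend, np.linspace(0, 1, n_bins + 1))
    seg = np.clip(np.searchsorted(edges, spend, side="right") - 1, 0, n_bins - 1)
    return [
        {"segment": s,
         "customers": int(np.sum(seg == s)),
         "avg_spend": float(spend[seg == s].mean()),
         "avg_visits": float(visits[seg == s].mean())}
        for s in range(n_bins)
    ]

rng = np.random.default_rng(1)
spend = rng.gamma(2.0, 50.0, size=10_000)   # hypothetical spend per customer
visits = rng.poisson(5, size=10_000)        # hypothetical visit counts
summary = summarize_by_segment(spend, visits)
print(summary)  # 10,000 rows reduced to 3 segment summaries a decision maker can read
```

The point is not the specific technique: any aggregation or clustering that turns ten thousand rows into three lines a decision maker can act on is doing the "digestible chunks" work described above.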


Company executives need to understand that human touch plays a fundamental role in the big data journey. Insights delivered by technology without proper human interface can put their business at risk, alienate customers, and damage their brand. Given the current advancements, it comes down to putting the right technologies to use and getting the right people (who know your business) in the room to derive value out of the ‘Big Data’. Is that an easy thing to do? What has been your experience?

You can also contact me on twitter @jasmeetio or on my personal website – Jasmeet Sawhney.

Source: Big Data’s Big Deal is HUMAN TOUCH, not Technology

Entrepreneur America’s Wish-List For The President


The presidential campaign has ended with cheers and happiness for the Obama administration and its supporters. Now comes the huge task, as claimed by both candidates: the effort to rebuild America. We all know why America is called a land of opportunity, with a distinct edge that separates it from the rest of the world. It is the entrepreneurial culture that gives everyone hope and the opportunity to see one’s dreams come true. But with the current economic turmoil, political instability and lack of focus, America is losing its luster as the entrepreneurial powerhouse of the world. We immigrant entrepreneurs have all heard one or more stories about the brain drain happening in America. Several of my close entrepreneurial friends moved back to their native lands to pursue their dreams. American startups account for the majority of new jobs in the country, and yet the political system is gradually ignoring its role and letting America lose its entrepreneurial edge. Many countries are exploiting this situation and providing a haven for entrepreneurs. So, this is a plea from the entrepreneurs of America to the president to help bring America back to the map as the world’s top place to build a business and create innovation. Focusing on the following five things would go a long way toward assuring America’s entrepreneurs a successful run.

Immigration Reform:

We are all aware of America’s growing immigration problem and how it is impacting the country’s position in the world. The overwhelming Hispanic vote for Obama sent a clear message to both parties: immigration reform should be a primary agenda item for anyone who wants to win an election. For the math nerds: Hispanics made up 10 percent of the total vote and gave Obama almost three votes for every one earned by Romney. Obama may even have won a majority among Florida’s Cuban voters, who were once a Republican mainstay. For aspiring entrepreneurs stuck in the immigration debacle, it is a truly agonizing battle. Consider the crème-de-la-crème of immigrants, who graduate from top universities with top degrees and then have to wait five years before getting permanent residency. What does this forced five-year wait do? It demotivates, all but kills the entrepreneurial spirit, and turns an inventor into a worker. Imagine having to wait five years while sticking to a sponsor company for your H-1B. Even if you have the entrepreneurial bug, you have no option but to take a corporate job with a company willing to sponsor your H-1B, work your way through that company until you get your green card, and only then call it quits and get cranking on your entrepreneurial adventure. Is anything wrong with this logic? No sane and capable mind can afford to waste five prime years of life on a political loophole. This is the sole reason many entrepreneurs leave the country for more favorable places to pursue their dreams. America is also facing a huge shortage in the talent pool that could prepare it for tomorrow. Immigration reform will open pathways for this much-awaited talent to enter and disrupt the startup world in the U.S. of A. So, Mr. Obama, immigrant entrepreneurs are angry and annoyed by the delay in immigration reform; we hope it will make its way through in this term, helping America reclaim its crown when it comes to entrepreneurship.

Capital Access:

Want to invest in the future? What better way than investing in American entrepreneurs, helping solve the problems of tomorrow? Why pay banks billions of dollars in bailouts when that is not going to solve the economic crisis? Using that money to crank the innovation engine will go a long way, producing more business opportunities, more businesses and therefore more employment. Remember, more than 80% of the workforce is employed by small businesses, so it is important to invest in small businesses and startups. President Obama should use his second term to create more opportunities for American small businesses. We saw some much-needed love from his administration during the first term, when the U.S. Small Business Administration was given a bigger wallet for loan sizes on its largest lending program. More federal programs should be introduced to help entrepreneurs in a similar manner, providing funds, grants and contracts for commercializing products and taking businesses to the next level. Other measures awaiting the light of day are the proposal to reduce taxes on startup expenses, an extension of the policy that allows businesses to deduct the cost of equipment and software, and a $1 billion expansion of the government’s small business investment program. Some tax breaks would also go a long way in helping startups with operational expenses. So, capital access for startups is another big agenda item seeking the President’s attention.

Flexible and friendly regulations:

Regulations are another area biting hard, making it almost impossible for many businesses to run smoothly. The Atlantic and Yahoo! Finance jointly published an in-depth article, “What Kills Small Business? Let’s Ask Them,” which states that “69 percent of small business owners and managers say that complicated government regulations are ‘major impediments’ to the creation of new jobs.” Entrepreneurs, startups and small businesses need flexible and friendly regulations. The last four years were consumed by partisan deadlock, delaying every effort to ease the regulations that burden small businesses. Luckily, President Obama is a strong believer in investing in small businesses, and his first term showed some signs of it: the Jumpstart Our Business Startups (JOBS) Act was approved last April. This term should be used to promote and execute the JOBS Act, which helps small businesses gain access to financing while protecting investors. The JOBS Act also raises the curtain on crowdfunding, which is going to be another huge channel of liquidity to small businesses, critical to startup success. The law also creates an IPO on-ramp that gives young companies temporary relief from certain SEC regulations, making it easier and more feasible to go public. These four years should be focused on making sure those programs see the light of day and leave businesses with fewer regulations in the way of growth. An effort should be made to roll out similar acts as soon as possible, so that small businesses and startups can benefit from them sooner and grow rapidly without worrying about regulations getting in the way.

Boosting and fixing exports and trade:

This presidential election had us all listening to how unfairly China plays with its trade policy, and how big corporations take advantage of tax loopholes and offshore refuges to claim tax exemptions and report less revenue, costing America billions in taxes while giving big companies and foreign corporations an unfair advantage. To give small businesses a fair chance to survive, it is important for the Obama administration to fix those trade loopholes and give everyone a fair shot at winning the customer. So, as promised, the administration should crack down on the laws giving undue advantage to a few. According to Obama campaign spokesman Adam Fetcher, looking ahead, the president “will make sure that no foreign company has an unfair advantage over American companies by offering more generous financing through the Export-Import Bank.” The Export-Import Bank supported a record $6 billion in financing and insurance for small businesses last year, and more than 85 percent of the bank’s transactions were for small businesses. In 2010, President Obama also set a goal to double exports over five years and created the National Export Initiative to promote U.S. goods, remove trade barriers, expand access to credit, and promote strong growth worldwide. This initiative will also help local small businesses get international exposure and an opportunity to expand. So, fixing and boosting export and trade laws will help entrepreneurs gain a competitive edge and an opportunity to expand into foreign markets. The president is also due to negotiate a new, expanded Trans-Pacific Partnership, a trade pact with nations including Australia, Singapore and Mexico, among others. Fixing import-export issues will go a long way in establishing fair competition and, hence, healthy business.

Energize Startup/Small businesses initiative:

The Obama administration should help in every possible way to promote startups: provide as much opportunity, assistance, funding and work as possible. During the administration’s first term, $300 billion in federal prime contracts were awarded to small businesses since 2009, and one-third of all federal contracting dollars in the Recovery Act went to small businesses. President Obama has further emphasized his interest in an infrastructure bill that could boost construction and engineering jobs across the country. The president plans “to create an economy built to last that relies on small-business growth, investments in entrepreneurship, advanced manufacturing and increased exporting,” Fetcher says. This is certainly a fine strategy, as is the Kauffman Foundation’s work on the “Startup Act,” which proposes softer immigration laws for entrepreneurs, improved channels for early-stage financing, fixing the broken patent system and clearing its backlog, and removing barriers to the formation and growth of businesses through automatic ten-year sunsets for all major rules. Startup America is another great initiative, started to help build a community of collaborative learning among startups. More such efforts are needed from the administration to assist startups, and many private companies have jumped into the pool after seeing the administration do it. Mr. President, please build more such programs that help entrepreneurs connect, collaborate and huddle their way to success.

So, these are the five things that would unleash entrepreneurs’ ability to build the world’s most disruptive startups and promote job growth. Mr. President, this is a humble plea from an immigrant entrepreneur who is trying to make a difference. With your help, we could all flourish, taking America with us on a path of prosperity and success. By the way, congratulations, Mr. President; you were a good president, and hopefully this term will establish you as the President who put America on its Innovation 2.0 map and saved the world.

Originally Posted at: Entrepreneur America’s Wish-List For The President by v1shal

5 tips to becoming a big data superhero


Who’s the most powerful superhero?

Rishi Sikka, MD has a favorite and it’s one most people have probably never even heard of: Waverider.

Sikka, senior vice president of clinical transformation at Advocate Healthcare, considers Waverider the most powerful superhero because he can surf the time stream and predict the future.

Leading up to his presentation here at the Healthcare IT News Big Data and Healthcare Analytics Forum in New York, Sikka looked up the word “hero” and found that it has existed for millennia — it was even used prior to tongues we can trace — and the root concept is “to protect.”

Based on that definition from the Latin, and with a focus on population health management in mind, Sikka shared a fistful of tips about becoming a big data superhero.

1. Your power is looking to the future, but your strength lies in the past and present. Healthcare professionals and organizations must assemble the data necessary to understand their current state of being, including knowing as much as possible about patients.

2. Pick your battles wisely. “All the great superheroes know when it’s time to move on,” Sikka said, pointing to the need for risk stratification and strategic resource allocation, which is “where big data and population health intersect.”

3. Your enemy has a name – and it’s regression to the mean. “I know it’s not very sexy,” Sikka said of his description of that enemy. He recommended that healthcare organizations consider the impactability of what they are doing, or focusing on where they can have the biggest impact. “I hope impactability will become a buzzword in the next year or two.”

4. Your superhero name is not … Cassandra. “It’s a lovely name,” Sikka explained, “just don’t pick it as a superhero name.” Why not? In Greek mythology, Cassandra, daughter of Apollo and a mortal mother, could predict the future. That was the blessing. The curse: Nobody believed her. “We don’t want population health to be an academic exercise.”

5. Don’t forget your mission. Every superhero is out fighting the bad guys, saving humanity, but sometimes even they can forget why they’re on this earth. “When we talk about population health we talk a lot about cost. We talk about bending the cost curve,” he added, “but I don’t know a single person on the front lines of care who gets jazzed up to bend the cost curve. The work revolves fundamentally around health.” Sikka suggested that healthcare professionals work to steer the dialogue back to clinical outcomes and wellness.

Sikka wound back around to the root of the word hero: “Our goal with respect to analytics, big data, population health,” he said, “is to protect, aid, support, those who give and receive care.”

To read the original article on Healthcare IT News, click here.

Originally Posted at: 5 tips to becoming a big data superhero

Big universe, big data, astronomical opportunity

30 Oct 2010 --- Open cluster Messier 39 in the constellation Cygnus. --- Image by © Alan Dyer/Stocktrek Images/Corbis

Astronomical data is and has always been “big data”. Once that was only true metaphorically, now it is true in all senses. We acquire it far more rapidly than the rate at which we can process, analyse and exploit it. This means we are creating a vast global repository that may already hold answers to some of the fundamental questions of the Universe we are seeking.

Does this mean we should cancel our upcoming missions and telescopes? After all, why continue to order food when the table is replete? Of course not. What it means is that, while we continue our inevitable yet budget-limited advancement into the future, we must also simultaneously do justice to the data we have already acquired.

In a small way, we are already doing this. Consider citizen science, where public participation in the analysis of archived data increases the possibility of real scientific discovery. It’s a natural evolution, giving those with spare time on their hands the chance to advance scientific knowledge.

However, soon this will not be sufficient. What we need is a new breed of professional astronomy data-miners eager to get their hands dirty with “old” data, with the capacity to exploit more readily the results and findings.

Thus far, human ingenuity, and current technology have ensured that data storage capabilities have kept pace with the massive output of the electronic stargazers. The real struggle is now figuring out how to search and synthesize that output.

The greatest challenges for tackling large astronomical data sets are:

Visualisation of astronomical datasets
Creation and utilisation of efficient algorithms for processing large datasets
The efficient development of, and interaction with, large databases.
The use of “machine learning” methodologies
The challenges unique to astronomical data are borne of the characteristics of big data, the three Vs: volume (the amount of data), variety (the complexity of the data and the sources it is gathered from) and velocity (the rate of data and information flow). It is a problem that is getting worse.

In 2004, the data I used for my Masters had been acquired in the mid-1990s by the United Kingdom Infra-Red Telescope (UKIRT) in Hawaii. In total it amounted to a few tens of gigabytes.

Moving on just a matter of months to my PhD, I was studying data from one of the most successful ground-based surveys in the history of astronomy, the Sloan Digital Sky Survey (SDSS). The volume of data I had to cope with was orders of magnitude greater.

SDSS entered routine operations in 2000. At the time of Data Release 12 (DR12) in July 2014 the total volume of that release was 116TB. Even this pales next to the Large Synoptic Survey Telescope (LSST). Planned to enter operation in 2022, it is aiming to gather 30TB a night.

To make progress with this massive data set, astronomy must embrace a new era of data-mining techniques and technologies. These include the application of artificial intelligence, machine learning, statistics, and database systems, to extract information from a data set and transform it into an understandable structure for further use.
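To make the data-mining step concrete, here is a deliberately tiny sketch (entirely synthetic data, not a real survey catalogue; the "concentration" feature and its numbers are invented): raw measurements go in, and a learned decision rule turns them into an understandable structure.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical catalogue: point sources ("stars") are compact, extended
# sources ("galaxies") are not; `concentration` is a made-up shape feature.
stars = rng.normal(loc=2.0, scale=0.3, size=5000)
galaxies = rng.normal(loc=3.5, scale=0.5, size=5000)
concentration = np.concatenate([stars, galaxies])
true_label = np.concatenate([np.zeros(5000), np.ones(5000)])  # 0=star, 1=galaxy

# A minimal "machine learning" step: learn a decision threshold from the data
threshold = concentration.mean()            # crude, but derived from the sample itself
pred = (concentration > threshold).astype(float)
accuracy = float((pred == true_label).mean())
print(f"learned threshold: {threshold:.2f}, accuracy: {accuracy:.1%}")
```

Real pipelines replace the one-line threshold with far richer models, but the shape of the task is the same: extract a rule from the archive and use it to structure data nobody has time to inspect by hand.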

Now while many scientists find themselves focused on solving these issues, let’s just pull back a moment and ask the tough questions. For what purpose are we gathering all this new data? What value do we gain from just collecting it? For that matter, have we learned all that we can from the data that we have?

It seems that the original science of data, astronomy, has a lot to learn from the new kid on the block, data science. Think about it. What if, as we strive to acquire and process more photons from the farther reaches of the universe, from ever more exotic sources and with ever more complex instrumentation, somewhere on a dusty server on Earth the answers are already here, if we would only pick up that dataset and look at it … possibly for the first time.

Dr Maya Dillon is the community manager for Pivigo. The company supports analytical PhDs making the transition into the world of Data Science and also runs S2DS: Europe’s largest data science boot-camp.

To read the original article on The Guardian, click here.

Source: Big universe, big data, astronomical opportunity by analyticsweekpick

2017 Trends in the Internet of Things

The Internet of Things is lurching forward into the coming year like never before. Its growth is manifesting rapidly, exponentially, with an increasingly broad array of use cases and applications influencing verticals well removed from its conventional patronage in the industrial internet.

With advances throughout the public and private sectors, its sway is extending beyond retail and supply chain management to encompass facets of delivery route optimization, financial services, healthcare and the marked expansion of the telecommunication industry in the form of connected cities and connected cars.

A host of technological approaches, some novel, some refined, will emerge in the coming year to facilitate the analytics and security functionality necessary to solidify the IoT’s impact across the data sphere, with a unique blend of big data, cloud, cognitive computing and processing advancements for customized applications of this expression of IT.

The result will be a personalization of business opportunities and consumer services veering ever closer to laymen users.

Speed of Thought Analytics
The interminable sensor generation and streaming of data foundational to the IoT demands heightened analytic productivity, facilitated in a variety of ways. Surmounting the typical schema constraints germane to the relational world can involve semantic technologies with naturally evolving models to accommodate time-sensitive data. Other techniques involve file formats capable of deriving schema on the fly. “Self-describing formats is the umbrella,” MapR Senior Vice President of Data and Applications Jack Norris reflected. “There are different types of files that kind of fall into that, such as JSON and Avro.” Still other approaches involve Graphics Processing Units (GPUs), which have emerged as a preferable alternative to conventional Central Processing Units (CPUs) to enable what Kinetica VP of Global Solution Engineering Eric Mizell referred to as answering questions at “the speed of thought,” in which organizations are not limited by schema and indexing designs in the number, speed, and type of questions analytics can answer in real time.
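For readers unfamiliar with schema-on-read, a minimal plain-Python sketch (not tied to MapR or any particular platform; the sensor records are invented) shows how structure can be derived from self-describing JSON records on the fly:

```python
import json

def infer_schema(records):
    """Derive a field -> set-of-type-names mapping from self-describing
    JSON records, with no schema declared up front."""
    schema = {}
    for raw in records:
        for field, value in json.loads(raw).items():
            schema.setdefault(field, set()).add(type(value).__name__)
    return schema

# Sensor readings arrive with varying fields; the schema emerges from the data
stream = [
    '{"sensor": "t-101", "temp": 21.5}',
    '{"sensor": "t-102", "temp": 22.1, "humidity": 40}',
    '{"sensor": "t-101", "temp": "n/a"}',
]
print(infer_schema(stream))  # each field mapped to the value types seen so far
```

Because every record carries its own field names and types, new fields (like `humidity` above) and type drift (like the `"n/a"` temperature) are absorbed as data arrives rather than breaking a fixed relational schema.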

According to Mizell, GPUs are “purpose-built for repetitive tasks at parallel with thousands of cores for aggregation, mathematics, and those type of things,” whereas CPUs are better for discrete, sequential operations. Analytics platforms leveraging GPUs, particularly those designed for the IoT, are not bound by schema and rigid indexing, allowing questions to be asked at a pace matching the speed at which data is generated, especially when augmented with visualization mechanisms illustrating fluctuating data states. “You can ask questions of the data without having to have predetermined questions and find answers in human readable time,” Mizell explained. “We’re getting tremendous response from customers able to load hundreds of millions and billions of rows [and] include them in interactive time. It transforms what business can do.” These capabilities are integral to the expansion of the IoT in the telecommunications industry, as “connected cities and connected cars are huge with a lot of the telcos,” according to Mizell.

Machine Interaction
The best means of deriving value from the IoT actually transcends analytics and necessitates creating action between connected machines. Effecting such action in real time will increasingly rely on the various forms of artificial intelligence pervading modern enterprises, most readily accessible through options for machine learning and deep learning. Furthermore, Forbes contends AI is steadily moving to the cloud, which is instrumental in making these capabilities available to all organizations, not just those armed with a slew of data scientists. Regarding the libraries of deep learning and machine learning algorithms available to organizations today, Mizell remarked, “We’re exposing those libraries for consumers to use on analytics and streaming. On the data streaming end we’ll be able to execute those libraries on demand to make decisions in real-time.” The most cogent use case for machine-to-machine interaction involving the IoT pertains to connected cars, autonomous vehicles, and some of the more cutting-edge applications for race car drivers. These vehicles account for the requisite action in such time-sensitive applications by leveraging GPU-facilitated AI in real time. “For autonomous cars, the Tesla has a bank of GPUs in the trunk,” Mizell commented. “That’s how it’s able to read the road in real-time.”

Back from the Edge
Another substantial trend to impact the IoT in the coming year is the evolving nature of the cloud as it relates to remote streaming and sensor data applications. Cloud developments in previous years were focused on the need for edge computing. The coming year will likely see a greater emphasis on hybrid models combining the decentralized paradigm with the traditional centralized one. In circumstances in which organizations have real-time, remote data sources on a national scale, “You can’t respond to it fast enough if you’re piping it all the way down to your data center,” Mizell said. “You’ll have a mix of hybrid but the aggregation will come local. The rest will become global.” One of the best use cases for such hybrid cloud models for the IoT comes from the U.S. Postal Service, which Mizell mentioned is utilizing the IoT to track mail carriers, optimize their routes, and increase overall efficiency. This use case is similar to deployments in retail in which route optimization is ascertained for supply chain management and the procurement of resources. Still, the most prominent development affecting the IoT’s cloud developments could be that “all of the cloud vendors are now providing GPUs,” Mizell said. “That’s very new this year. You’ve got all the big three with a bank of GPUs at the ready.” This development is aligned with the broadening AI capabilities found in the cloud.

Software Defined Security
Implementing IoT action and analytics in a secure environment could very well be the central issue in the viability of this technology for the enterprise. Numerous developments in security are taking place to reduce the number and magnitude of attacks on the IoT. One means of protecting both endpoint devices and the centralized networks on which they are based is software defined networking, which is enjoying a resurgence of sorts due to IoT security concerns. The core of the software defined networking approach is the intelligent provisioning of resources on demand for the various concerns of a particular network. In some instances this capability means dedicating resources for bandwidth and traffic; in others it applies directly to security. In the latter case the network can create routes for various devices, on the fly, to either connect or disconnect devices from centralized frameworks according to security protocols. “Devices are popping up left and right,” Mizell acknowledged. “If it’s an unknown device shut it down. Even if it has a username and a password, don’t give it access.” Some applications of the IoT certainly warrant such security measures, including financial industry forays into digital banking in which mobile devices function as ATMs, allowing users to withdraw funds from their phones and have cash delivered to them. “That’s what they say is in the works,” Mizell observed.

Endpoint Security
Security challenges for the IoT are exacerbated by the involvement of endpoint devices, which typically have much less robust security than centralized frameworks do. Moreover, such devices can actually be used to perpetrate attacks on the IoT and wreak havoc on centralized mainframes. Strengthening device security can now take the form of endpoint device registration and authorization. According to Mizell: “There’s a notion of device registration, whether it’s on the network or not. If [you] can bring your phone or whatever device to work, it detects the device by its signature, and then says it only has access to the internet. So you start locking devices into a certain channel.” Blockchain technologies can also prove influential in securing the IoT. These technologies have natural encryption techniques that are useful for this purpose. Moreover, they utilize a decentralized framework in which the validity of an action or addendum to the blockchain (which could pertain to IoT devices in this case) is determined by reaching a consensus among those involved. This decentralized, consensus-based authorization could prove valuable for protecting the IoT from attacks.
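The registration-and-channel idea described above can be sketched in a few lines. This is a hypothetical illustration only: the device signatures, channel names, and registry structure are all invented, not from any real product.

```python
# Hypothetical registration-based access control: unknown devices are denied
# even with valid credentials; known devices are locked to a specific channel.
REGISTRY = {
    "aa:bb:cc:01": "corporate",   # registered workstation
    "aa:bb:cc:02": "internet",    # employee phone: locked to internet-only
}

def route_device(signature, has_valid_credentials):
    """Return the channel a device is locked to, or None to shut it down."""
    channel = REGISTRY.get(signature)
    if channel is None:
        return None               # unregistered: deny, credentials or not
    if not has_valid_credentials:
        return None               # registered but unauthenticated: deny
    return channel

print(route_device("aa:bb:cc:99", True))   # unknown device: None (shut down)
print(route_device("aa:bb:cc:02", True))   # known phone: internet
```

The key property matches Mizell's rule: possession of a username and password is not enough; a device that has never been registered gets no route at all, and a registered one gets only its assigned channel.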

As the use cases for the IoT become more and more surprising, it is perhaps reassuring to realize that the technologies enabling them are becoming more dependable. Accessing the cognitive computing capabilities to implement machine-based action and swift analytics via the cloud is within the grasp of most organizations. The plethora of security options can strengthen IoT networks, helping to justify their investments. Hybrid cloud models combine the best of both worlds for instantaneous action as well as data aggregation. Thus, the advantages of the continuous connectivity and data generation of this technology are showing significant signs of democratization.

Source: 2017 Trends in the Internet of Things by jelaniharper

Dipping Customer Satisfaction? 5 Lessons from a cab driver

Yes, you read it right. Great customer experience can come from anywhere. I want to bring your attention to a personal service encounter of a leading customer experience advocate, Scott McKain. The encounter in question is a cab journey with a cab driver known as “Taxi Terry”. No, I am not going to bug you with a word-for-word transcript of something you could extract more pleasure from by watching the video at the end.
This experience teaches us some fundamental lessons.

The top 5 lessons are:
1.      Understand your customer well:

In the video, cab driver Taxi Terry works hard at connecting with his customers and understanding their stories. He is not only listening but also recording those stories. It is something we all could learn from. In today’s world, when competition is at its peak and companies are sitting on tight budgets, it has become absolutely essential to retain existing clients.
What better way to retain existing clients than to know their stories and connect with them. The more you know about your customers, the stronger the bond, one that is comfortable and trustworthy. This not only leads to long-term business relationships but also to more referral/word-of-mouth opportunities.

2.      Build a system to deliver experience and not just service:
From the statement “Are you ready for the best cab ride of your life” to referring Scott to Taxi Terry’s website for his receipt and future bookings, everything was built not just to deliver a service but to deliver an experience. The outcome? Look at this blog, the video, and the hits on the YouTube video. People do not remember service, but they do remember experience. So, if an effort is put in place to build a system that delivers an experience, customers will carry it along for effective word-of-mouth, ultimately resulting in better branding.

Consider a movie trailer. Movie trailers are not made merely to announce an upcoming movie but to deliver an experience, which you take with you when you leave and share with the people around you. With surging social media channels, nothing could be better than a satisfied customer willing to share their story with friends and family. So, any effort in delivering experience will ultimately deliver more word-of-mouth and better loyalists.

3.      Keep on minimizing complexities for your customers:
One more thing that stood out in the video was the cab driver’s consistent effort to make sure the customer faced minimum difficulty. Every inch of effort was aimed at making the customer feel comfortable and at ease. It resulted in consistent spurts of WOW moments at how easy something frustrating, dry, and mundane had been made. This ultimately helped the customer focus on other valuable aspects of the service. The more complicated the services, the more they distract customers from a good experience. So, it is worth every penny to fine-tune processes to make them simple and easy to follow. Companies like Amazon, Zappos, and Southwest are pioneers at this.

4.      Always work to create a WOW experience, even if it costs extra:
While delivering services, how many of us actually focus on delivering a WOW moment to delight customers? This is another big hole that Taxi Terry addresses. His constant effort to entertain his clients has paid off handsomely. Surely, he might have had to spend something extra to integrate the process into his routine services, but it ultimately helped him gain loyal customers and build a strong customer-centric brand. What these small things did for him could not have been achieved by big investments in marketing. So, investing in that WOW experience should always be considered. It not only helps build a powerful and sustainable brand around customer centricity but also helps gain loyal customers along the way.
5.      Always use the best tools for the job that help you excel in customer engagement:
A cab driver with a weather map is surely something no one expected. Even a database with customer history is something one could never expect from him. This openness to the latest tools and techniques for delivering a quality experience is something we should all learn from this video.

Having better tools to help us deliver a WOW experience is always an asset. It not only helps in delivering a better service but also helps in establishing a competitive edge, which ultimately results in building a stronger and more identifiable value proposition.

With these 5 lessons, one could easily create a well-rounded customer experience approach: a system that generates continuous and sustainable WOW while building a customer-centric company.

Did it hit you in a different way? I would love to know your thoughts. Please leave a comment with any suggestions/criticism.

I certainly would love to give a shout-out to Scott McKain for this video.

Now, without further ado, the video:

Source by v1shal

What Is a Residential IP, Data Center Proxy and what are the Differences?

A residential IP is simply an IP address that an ISP assigns to a residential customer. When you connect to the internet, you connect using an IP address. To find your current IP address, you can use the What Is My IP site. It will display your IP address and your ISP name, as well as the country you are connecting to the internet from.

An IP address is a set of numbers appearing in a pattern and separated by full stops, such as 192.168.0.1. If you use a residential IP address as your proxy when connecting to the internet from your residence, your real IP address will be masked: you will be assigned a different IP address, which is called a residential IP address.
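Python's standard `ipaddress` module can parse and inspect addresses of this form; the address below is the illustrative example from above, not a real assignment.

```python
import ipaddress

# Parse a dotted-quad IPv4 address (example value only).
addr = ipaddress.ip_address("192.168.0.1")
print(addr.version)     # 4: this is an IPv4 address
print(addr.is_private)  # True: 192.168.0.0/16 is reserved for private networks
```

`is_private` distinguishes addresses reserved for local networks from publicly routable ones; the address a website actually sees is always a public one.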

What is a datacenter proxy?

Unlike a residential IP, a datacenter proxy is not owned by an ISP. It acts as a shield between you and the web, so anyone spying on what you are doing cannot track you. Your home IP address and all the information related to it are hidden, and only the datacenter proxy is displayed, together with the details of the datacenter proxy provider. A datacenter proxy thus masks your actual IP address and all your information; however, its performance is not as effective as that of a residential IP.
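Mechanically, routing traffic through either kind of proxy looks the same from the client side. Here is a minimal sketch using Python's standard `urllib`; the proxy hostname and port are placeholders, not a real provider.

```python
import urllib.request

# Placeholder proxy address for illustration only; a real residential or
# datacenter proxy provider would supply the actual host, port, and credentials.
proxy = urllib.request.ProxyHandler({
    "http": "http://proxy.example.com:8080",
    "https": "http://proxy.example.com:8080",
})
opener = urllib.request.build_opener(proxy)

# Requests made through `opener` are routed via the proxy, so the target
# site sees the proxy's IP address rather than your home IP.
# opener.open("https://example.com")
```

Whether the site sees a residential or a datacenter address then depends entirely on which kind of proxy the provider gave you.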

Difference between a residential IP and a datacenter proxy

Let’s say you are browsing the web from public Wi-Fi and need to hide your real IP, since most public Wi-Fi connections are not secure. The real point of using a residential IP address as a proxy is to ensure that sites don’t know who exactly you are, since no information associated with you is made available to the websites you visit.

Residential IP Proxies

Genuine and legitimate: It is easy to create multiple datacenter proxies, but obtaining many residential proxies is difficult, since residential IPs are mainly used for residential purposes. This is why residential IPs are considered more genuine and legitimate than datacenter proxies.

Costly, with few providers: Residential IPs are difficult to obtain, which makes them more expensive, since fewer providers offer them; in fact, a monthly subscription for hundreds of residential IPs is extremely expensive. However, sometimes a monthly subscription for hundreds of residential IPs can be cheaper than a larger monthly subscription for datacenter IP proxies.

Sometimes prone to blacklisting: Although residential IPs are genuine and legitimate, they are also liable to be abused. In such situations, they get blacklisted by some security technologies and databases. Therefore, a residential proxy connection is good, although not perfect.

Datacenter proxies

Less genuine, though still protective: Websites can detect a user who is accessing them via a proxy connection, and since many users spam these websites through proxies, you could be held accountable when accessing these websites through one. However, what the websites detect is the datacenter proxy, since your real IP address and all information associated with you are shielded. It is therefore better to use a fresh datacenter proxy for each account than to access the web with your real IP for all your accounts.

Cheaper, with more providers: It is easy to collect datacenter proxies, and they are offered by hundreds of providers. This makes them less expensive; in fact, they cost a fraction of what residential IP proxies would cost you.

Which is best: a residential IP or a datacenter proxy?

This post is not aimed at selling either of the two, so it is up to you to decide which one best suits your needs. However, it is good to be careful when getting advice from a proxy or VPN provider.

Datacenter proxies are easy to get and less expensive. Using them could cost you a fraction of what residential IPs would cost; however, if legitimacy matters to you, you are better off using residential IPs.


Having learned the difference between residential IPs and datacenter proxies, it is your turn to choose which one suits your needs. However, it is good to consider using something that is genuine at all times.


Assess Your Data Science Expertise

Data Skills Scoring System

What kind of a data scientist are you? Take the free Data Skills Scoring System Survey at

Companies rely on experts who can make sense of their data. Often referred to as data scientists, these people bring their specific skills to bear in helping extract insight from the data. These skills include such things as Hacking, Math & Statistics and Substantive Expertise. In an interesting study published by O’Reilly, Harlan D. Harris, Sean Patrick Murphy and Marck Vaisman surveyed several hundred practitioners, asking them about their proficiency in 22 different data skills. They found that data skills fell into five broad areas: Business, ML / Big Data, Math / OR, Programming and Statistics.

Complementary Data Skills Required

There are three major tasks involved in analytics projects. First, you need to ask the right questions, requiring deep knowledge of your domain of interest, whether that be for-profit business, non-profits or healthcare organizations. When you know your domain area well, you are better equipped to know what questions to ask to get the most value from your data. Second, you need access to the data to help you answer those questions. These data might be housed in multiple data sources, requiring a data worker with programming skills to access and intelligently integrate data silos. Finally, you need somebody to make sense of the data to answer the questions proposed earlier. This step requires data workers who are more statistically-minded and can apply the right analytics to the data. Answering these questions could be more exploratory or intentional in nature, requiring different types of statistical and mathematical approaches.
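The three tasks can be illustrated with a toy example: a domain question ("which region drives revenue?"), integration of two data silos, and a simple statistical answer. All data here is made up purely for illustration.

```python
import statistics

# Two hypothetical data silos keyed by customer id.
crm = {"cust1": "north", "cust2": "south", "cust3": "north"}    # silo 1: region
billing = {"cust1": 1200.0, "cust2": 450.0, "cust3": 800.0}     # silo 2: revenue

# Programming skill: join the silos on customer id.
by_region = {}
for cust, region in crm.items():
    by_region.setdefault(region, []).append(billing[cust])

# Statistical skill: summarize to answer the domain question.
answer = {region: statistics.mean(vals) for region, vals in by_region.items()}
print(answer)  # {'north': 1000.0, 'south': 450.0}
```

Even in this tiny sketch, three distinct skills are visible: knowing the question is worth asking, wiring the silos together, and choosing an appropriate summary statistic.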

Getting value from data is no simple task, often requiring data experts with complementary skills. After all, I know of nobody who possesses all the data skills needed to successfully tackle data problems. No wonder data science has been referred to as a team sport.

Data Skills Scoring System (DS3)

We at AnalyticsWeek have developed the Data Skills Scoring System (DS3), a free web-based self-assessment survey that measures proficiency across five broad data science skills: business, technology, math and modeling, programming and statistics. Our hope is that the DS3 can optimize the value of data by improving how data professionals work together. If you are a data professional, the DS3 can help you:

  1. identify your analytics strengths
  2. understand where to improve your analytics skill set
  3. identify team members who complement your skills
  4. capitalize on job postings that match your skill set

While the publicly available DS3 is best suited for individual data professionals, we are customizing the DS3 for enterprises to help them optimize the value of their data science teams. By integrating DS3 scores with other data sources, enterprises will be able to improve how they acquire, retain and manage data professionals.

Find out your data skills score by taking the free Data Skills Scoring System Survey:

We are also conducting research using the DS3 that will advance our understanding of the emerging field of data science. Some questions we would like to answer are:

  • Do certain data skills cluster together?
  • Are some data skills more important than others in determining project success?
  • Are data science teams with comprehensive data skills more satisfied with their work than data science teams where some skills are lacking?

Respondents will receive a free executive summary of our findings.

Source by bobehayes

Is Big Data The Most Hyped Technology Ever?

I read an article today on the topic of Big Data. In the article, the author claims that the term Big Data is the most hyped technology ever, even compared to such things as cloud computing and Y2K. I thought this was a bold claim and one that is testable. Using Google Trends, I looked at the popularity of three IT terms to understand the relative hype of each (as measured by number of searches on the topic): Web 2.0, cloud computing and big data. The chart from Google Trends appears below.

We can learn a couple of things from this graph. First, interest in Big Data has continued to grow since its first measurable rise in early 2011. Still, the number of searches for the respective terms clearly shows that Web 2.0 and cloud computing received more searches than Big Data. While we don’t know if interest in Big Data will continue to grow, Google Trends in fact predicts a very slow growth rate for Big Data through the end of 2015.

Second, the growth rates of Web 2.0 and cloud computing were faster than the growth rate of Big Data, showing that public interest grew more quickly for those terms than for Big Data. Interest in Web 2.0 reached its maximum a little over two years after its initial ascent. Interest in cloud computing reached its peak in about 3.5 years. Interest in Big Data has been growing steadily for over 3.7 years.
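The "time from first ascent to peak" comparison can be made concrete with a small helper. The monthly interest values below are invented for illustration, not actual Google Trends data.

```python
# Hypothetical monthly search-interest values (0-100 scale, like Google Trends).
def months_to_peak(series):
    """Months from the first nonzero value to the maximum value."""
    first = next(i for i, v in enumerate(series) if v > 0)
    peak = max(range(len(series)), key=lambda i: series[i])
    return peak - first

web20 = [0, 0, 5, 20, 60, 100, 90, 80]   # fast ascent to a clear peak
bigdata = [0, 2, 4, 8, 15, 25, 40, 60]   # slower climb, still rising at the end
print(months_to_peak(web20))   # 3
print(months_to_peak(bigdata)) # 6
```

A term whose series is still rising at the end of the window, as Big Data was, yields an ever-growing time-to-peak, which is the pattern described above.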

One more thing of interest: for these three technology terms, the growth of each of the two later terms started at the peak of the previous one. As one technology becomes commonplace, another takes its place.

So, is Big Data the most hyped technology ever? No.

Source: Is Big Data The Most Hyped Technology Ever? by bobehayes

How to be Data-Driven when Data Economics are Broken

The day an IBM scientist invented the relational database in 1970 completely changed the nature of how we use data. For the first time, data became readily accessible to business users.  Businesses began to unlock the power of data to make decisions and increase growth. Fast-forward 48 years to 2018, and all the leading companies have one thing in common: they are intensely data-driven.

The world has woken up to the fact that data has the power to transform everything that we do in every industry, from finance to retail to healthcare, if we use it the right way. And businesses that win are maximizing their data to create better customer experiences, improve logistics, and derive valuable business intelligence for future decision-making. But right now, we are at a critical inflection point. Data is doubling each year, and the amount of data available for use in the next 48 years is going to take us to dramatically different places than the world has ever seen.

Let’s explore the confluence of events that have brought us to this turning point, and how your enterprise can harness all this innovation – at a reasonable cost.

Today’s Data-driven Landscape

We are currently experiencing a “perfect storm” of data. The incredibly low cost of sensors, ubiquitous networking, cheap processing in the Cloud, and dynamic computing resources are not only increasing the volume of data, but the enterprise imperative to do something with it. We can do things in real-time and the number of self-service practitioners is tripling annually. The emergence of machine learning and cognitive computing has blown up the data possibilities to completely new levels.

Machine learning and cognitive computing allow us to deal with data at an unprecedented scale and find correlations that no amount of brain power could conceive. Knowing we can use data in a completely transformative way makes the possibilities seem limitless. Theoretically, we should all be data-driven enterprises. Realistically, however, there are some roadblocks that make it difficult to take advantage of the power of data:

Trapped in the Legacy Cycle with a Flat Budget

The “perfect storm” of data is driving a set of requirements that dramatically outstrips what most IT shops can do. Budgets are flat, increasing only 4.5% annually, leaving companies feeling locked into a set of technology choices and vendors. In other words, they’re stuck in the “legacy cycle”. Many IT teams are still spending most of their budget just trying to keep the lights on. The remaining budget is spent trying to modernize and innovate, and then a few years later, all that new modern stuff that you bought is legacy all over again, and the cycle repeats. That’s the cycle of pain that we’ve all lived through for the last 20 years.

Lack of Data Quality and Accessibility

Most enterprise data is bad. Incorrect, inconsistent, inaccessible: these factors hold enterprises back from extracting value from data. In a Harvard Business Review study, only 3% of the data surveyed was found to be of “acceptable” quality. That is why data analysts spend 80% of their time preparing data rather than doing the analytics we’re paying them for. If we can’t ensure data quality, let alone access the data we need, how will we ever realize its value?

Increasing Threats to Data

The immense power of data also increases the threat of its exploitation. Hacking and security breaches are on the rise; the global cost of cybercrime is expected to reach $6 trillion by 2021, double the $3 trillion cost in 2015. In light of the growing threat, the number of security and privacy regulations is multiplying. Given the issues with data integrity, organizations want to know: Is my data both correct and secure? How can data security be ensured in the middle of this data revolution?

Vendor Competition is Intense

The entire software industry is being reinvented from the ground up, and all vendors are in a race to the cloud. Your enterprise should be prepared to take full advantage of these innovations and choose the vendors best prepared to liberate your data, not just today but tomorrow and the year after that.

Meet the Data Disruptors

It might seem impossible to harness all this innovation at a reasonable cost. Yet, there are companies that are thriving amid this data-driven transformation. Their secret? They have discovered a completely disruptive way, a fundamentally new economic way, to embrace this change.

We are talking about the data disruptors, and their strategy is not as radical as it sounds. These are the organizations that have found a way to put more data to work with the same budget. For the data disruptors, success doesn’t come from investing more budget in the legacy architecture. Instead, they take a modern data architecture approach that allows them to liberate their data from the underlying infrastructure.

Put More of Your Data to Work

The organizations that can quickly put the right data to work will have a competitive advantage. Modern technologies make it possible to liberate your data and thrive in today’s hybrid, multi-cloud, real-time, machine learning world. Here are three prime examples of innovations you need to know about:

  • Cloud Computing: The cloud has created new efficiencies and cost savings that organizations never dreamed would be possible. Cloud storage is remote and fluctuates to deliver only the capacity that is needed. It eliminates the time and expense of maintaining on-premise servers, and gives business users real-time self-service access to data, anytime, anywhere. There is no hand-coding required, so business users can create integrations between any SaaS and on-premise application in the cloud without requiring IT help. Cloud offers cost, capability and productivity gains that on-premise can’t compete with, and the data disruptors have already entrusted their exploding data volumes to the cloud.
  • Containers: Containers are quickly overtaking virtual machines. According to a recent study, the adoption of application containers will grow by 40% annually through 2020. Virtual machines require costly overhead and time-consuming maintenance, with a full hardware and operating system (OS) stack that needs to be managed. Containers are portable, with few moving parts and minimal maintenance required. A company using stacked container layers pays only for a small slice of the OS and hardware on which the containers are stacked, giving data disruptors unlimited operating potential at a huge cost savings.
  • Serverless Computing: Deploying and managing big data technologies can be complicated, costly and requires expertise that is hard to find. Research by Gartner states, “Serverless platform-as-a-service (PaaS) can improve the efficiency and agility of cloud services, reduce the cost of entry to the cloud for beginners, and accelerate the pace of organizations’ modernization of IT.”

Serverless computing allows users to run code without provisioning or managing any underlying system or application infrastructure. Instead, the systems automatically scale to support increasing or decreasing workloads on-demand as data becomes available.

Its name is a misnomer; serverless computing still requires servers, but the cost covers only the actual server capacity used. Companies are charged only for what they are running at any given time, eliminating the waste associated with on-premise servers. The system scales up as much as it needs to solve the problem, runs, scales back down, and turns off. The future is serverless, and its potential to liberate your data is limitless.
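The programming model is correspondingly simple: you write just the function, and the platform handles provisioning, scaling, and teardown. Below is a minimal sketch in the AWS Lambda handler style; the event shape and the computation are invented for illustration.

```python
# Minimal serverless-style function: the platform, not you, provisions the
# server, invokes the handler on demand, scales it, and tears it down.
def handler(event, context=None):
    """Summarize a batch of sensor readings; billed only while running."""
    readings = event.get("readings", [])
    return {"count": len(readings), "total": sum(readings)}

# The cloud platform would invoke this per request; locally we can simulate:
print(handler({"readings": [3, 4, 5]}))  # {'count': 3, 'total': 12}
```

Because the function holds no state between invocations, the platform is free to run zero, one, or a thousand copies depending on the incoming workload, which is exactly the on-demand scaling described above.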

Join the Data Disruptors

Now is the time to break free from the legacy trap and liberate your data so its potential can be maximized by your business. In the face of growing data volumes, the data disruptors have realized the potential of the latest cloud-based technologies. Their business and IT teams can work together in a collaborative way, finding an end-to-end solution to the problem, all in a secure and compliant fashion. Harness this innovation and create a completely disruptive set of data economics so your organization can efficiently surf the tidal wave of data.


The post How to be Data-Driven when Data Economics are Broken appeared first on Talend Real-Time Open Source Data Integration Software.