The mining and construction equipment maker wants a piece of the industrial Internet. Its strategy? Turn to the startup world for help.
General Electric isn't the only industrial giant attempting to jumpstart its business with data and software services. On Thursday morning Caterpillar announced it has made a minority investment in Uptake, a Chicago-based data analytics platform co-founded by long-time entrepreneur Brad Keywell, who is profiled in the current issue of Fortune.
As part of the agreement, Caterpillar and Uptake will co-develop "predictive diagnostics" tools for the larger company's customers. (Uptake says it is also working with other players in insurance, automotive, and healthcare, though it won't disclose other names.) The idea? To take the mountains of data spewing from bulldozers and hydraulic shovels and turn it into meaningful information that can help Caterpillar's customers catch potential maintenance issues before breakdowns occur, minimizing downtime.
"We had some experience in this [data analytics] because of our autonomous mining equipment," Doug Oberhelman, Caterpillar's CEO, said in an interview with Fortune last month. "But we were really looking for somebody to help us jumpstart this. And that's where the lightbulb went on between Brad and I."
Oberhelman's company, based in Peoria, Ill., is a 90-year-old manufacturer whose performance is often seen as a gauge of the health of the global economy. Uptake, meanwhile, is the brainchild of Keywell, a serial entrepreneur best known for co-founding daily-deal site Groupon. But the two CEOs bonded at a Chicago-area breakfast hosted by, of all people, British prime minister David Cameron.
"This is an early example of something that will become commonplace at some point: entrepreneurs trying to disrupt within an industry rather than disrupt an industry from the outside," Keywell told Fortune.
Of course, disrupting GE's $1 billion head start in data analytics (which includes a massive software center the company has built in the Bay Area) won't be easy. CEO Jeff Immelt has made it clear that the so-called "industrial Internet" will bring about the next wave of (much-needed) growth for his company, and the company has been hard at work developing applications based on its Predix platform and lining up customers like United Airlines and BP.
Uptake, located in the former Montgomery Ward building in Chicago (where many of Keywell's other startups are also headquartered), has about 100 employees, including a handful of data scientists from nearby universities. It is a speck compared to GE, the 27th largest company in the world. But its ability to snag Caterpillar as both an investor and a customer won't go unnoticed: Not everyone, especially competitors, will want to buy software from GE.
Caterpillar isn't disclosing how much money it is putting into Uptake, but the two companies have already been working closely for several months (Keywell's venture capital fund, Lightbank, has also invested in the startup). The ROI for Caterpillar could be far-reaching. While it takes three to five years to design and build a new bulldozer, Uptake's product development is measured in days and weeks. In an industry that will increasingly have to rely on software services for growth, incorporating some of the startup world's DNA (speed and agility) is a smart bet for Caterpillar.
Is your refrigerator running a spam operation? Thanks to the Internet of Things, the answer to that question could be yes.
Despite some dystopian fears, like that spamming refrigerator, the Internet of Things isn't just an eerie term that sounds like it was plucked from Brave New World. It is a vague one though, so to clear up any uncertainty, here's the dictionary definition: "a proposed development of the Internet in which everyday objects have network connectivity, allowing them to send and receive data."
As Altimeter Group points out in its new report, "Customer Experience in the Internet of Things," brands are already using this sci-fi technology in amazing ways to build customer relationships and optimize their products. In reality, it's more evolution than revolution, as companies are already tracking smartphone and Internet usage to gather data that provides crucial feedback about consumer behavior. As the report states, the Internet of Things only "brings us closer than ever to the ultimate marketing objective: delivering the right content or experience in the right context."
Talk of trackers and sensors and refrigerators gone wild may sound intimidating for brands that are still getting their content operations up and running, but some major companies are already exploring the new frontier of the Internet of Things. Here are the five brands doing it best.
1. Walgreens

Have you ever found yourself searching for a specific item in a pharmacy, wishing you could hit Ctrl+F to locate it, pay, and leave quickly? Aisle411, Google Tango, and Walgreens teamed up to create a new mobile app that can grant harried shoppers that wish. By using Google's virtual indoor 3D mapping technology, Aisle411 created a mobile shopping platform that lets consumers search and map products in the store, take advantage of personalized offers, and easily collect loyalty points.
"This changes the definition of in-store advertising in two key ways," Aisle411 CEO Nathan Pettyjohn told Mobile Commerce Daily. "Advertising becomes an experience (imagine children in a toy store having their favorite toy guide them through the store on a treasure hunt in the aisles of the store) and the end-cap is everywhere; every inch of the store is now a digital end cap."
According to a Forrester study, 19 percent of consumers are already using their mobile devices to browse in stores. Instead of forcing consumers to look away from their screens, Walgreens is meeting them there.
2. Taco Bell
Nowadays, practically everyone is reliant on their GPS to get them places. That's why Taco Bell is targeting consumers based on location by advertising and messaging them on mobile platforms like Pandora, Waze (a navigation app purchased by Google), and weather apps.
Digiday reports that in 2014, Taco Bell positioned ads on Waze for its 12 Pack product each Saturday morning to target drivers who might have been on their way to watch college football games. The Waze integration was so successful that Taco Bell decided to do the same thing on Sundays during the NFL season, this time advertising its Cool Ranch Doritos Locos Taco.
3. Home Depot
Home Depot has previously used augmented reality in its mobile app to allow users to see how certain products would look in their homes. IKEA is also known for enticing consumers with this mobile strategy. But now, Home Depot is making life even easier for shoppers by piloting a program that connects a customer's online shopping carts and wish lists with an in-store mobile app.
As explained in the Altimeter report, upon entering a Home Depot, customers who are part of the Pro Rewards program will be able to view the most efficient route through the store based on the products they shopped for online. And anyone who's been inside a Home Depot knows how massive and overwhelming those places can be without directions.
Creepy? Maybe. But helpful? Definitely. Michael Hibbison, VP of marketing and social media at Home Depot, defends the program to Altimeter Group: "Loyalty programs give brands more rope when it comes to balancing risks of creep. The way we think of it is we will be as personalized as you are loyal."
4. Tesla Motors
Getting your car fixed can be as easy as installing a software update on your phone, at least for Tesla customers. Tesla's cars are electric, powered by batteries similar to those that fuel your laptop and mobile device. So when Tesla had to recall almost 30,000 Model S cars because their wall chargers were overheating, the company was able to do the ultimate form of damage control. Instead of taking the products back or bothering customers to take them to a dealership, Tesla just updated the software of each car, effectively eliminating the problem in all of its products.
Tesla has also used this connectedness to crowdsource improvements to its products. As reported by Altimeter, a customer recently submitted a request for a crawl feature that allows the driver to ease into a slow cruise control in heavy traffic. Tesla not only granted the customer's request but also added the feature to its entire fleet of cars with just one software update.
5. McDonald's

McDonald's may be keeping it old school with its Monopoly contest, which, after 22 years, can still be won by peeling back stickers on your fries and McNuggets. But for its other marketing projects, McDonald's is getting pretty tech savvy.
McDonald's partnered with Piper, a Bluetooth low-energy beacon solution provider, to greet customers on their phones as they enter the restaurant. Through the app, consumers are offered coupons, surveys, Q&As, and even information about employment opportunities.
What does McDonald's get out of it? Data. Lots of data. When customers enter comments, their feedback is routed to the appropriate manager, who can respond to the request before the person leaves the establishment.
Too close for comfort? Not compared to the company's controversial pay-with-a-hug stunt. And at least this initiative is working. According to Mobile Commerce Daily, in the first month of the app's launch McDonald's garnered more than 18,000 offer redemptions, and McChicken sales increased 8 percent.
By tapping into the Internet of Things, brands can closely monitor consumer behavior and, even though it may sound a bit too invasive, put the data they collect to good use. With sensors, a product can go from being a tool to an actual medium of communication between the marketer and the consumer. That sounds pretty cool. But, just to be safe, if you get a shady email from your fridge, maybe don't open it.
To read the original article on The Constant Strategist, click here.
Customer feedback professionals are asked to demonstrate the value of their customer feedback programs. They are asked: Does the customer feedback program measure attitudes that are related to real customer behavior? How do we set operational goals to ensure we maximize customer satisfaction? Are the customer feedback metrics predictive of our future financial performance and business growth? Do customers who report higher loyalty spend more than customers who report lower levels of loyalty? To answer these questions, companies look to a process called business linkage analysis.
Business Linkage Analysis is the process of combining different sources of data (e.g., customer, employee, partner, financial, and operational) to uncover important relationships among key variables (e.g., call handle time and customer satisfaction). For our context, linkage analysis will refer to the linking of other data sources to customer feedback metrics (e.g., customer satisfaction, customer loyalty).
Business Case for Linkage Analyses
Based on a recent study of customer feedback program best practices (Hayes, 2009), I found that companies that regularly conduct operational linkage analyses with their customer feedback data had higher customer loyalty (72nd percentile) compared to companies that do not conduct linkage analyses (50th percentile). Furthermore, customer feedback executives were substantially more satisfied with their customer feedback program's ability to help them manage customer relationships when linkage analyses (e.g., operational, financial, constituency) were a part of the program (~90% satisfied) compared to their peers in companies that did not use linkage analyses (~55% satisfied). Figure 1 presents the effect size for VOC operational linkage analyses.
Linkage analyses appear to have a positive impact on customer loyalty by providing executives the insights they need to manage customer relationships. These insights give loyalty leaders an advantage over loyalty laggards. Loyalty leaders apply linkage analysis results in a variety of ways to build a more customer-centric company: determining the ROI of different improvement efforts, creating customer-centric operational metrics (important to customers), and setting employee training standards to ensure customer loyalty, to name a few. In upcoming posts, I will present specific examples of linkage analyses using customer feedback data.
Linkage Analysis: A Data Management and Analysis Problem
You can think of linkage analysis as a two-step process: 1) organizing two disparate data sources into one coherent dataset and 2) conducting analyses on that aggregated dataset. The primary hurdle in any linkage analysis is organizing the data in a way where the resulting linked dataset makes logical sense for our analyses (the appropriate unit of analysis). Therefore, data management and statistical skills are essential in conducting a linkage analysis study. More on that later.
Once the data are organized, the researcher is able to conduct nearly any kind of statistical analysis (e.g., regression, ANOVA, multivariate), as long as it makes sense given the types of variables (e.g., nominal, interval) being used.
Types of Linkage Analyses
In business, linkage analyses are conducted using the following types of data (see Figure 2):
Even though I discuss these data sources as if they are distinct, separate sources of data, it is important to note that some companies have some of these data sources housed in one dataset (e.g., call center system can house transaction details including operational metrics and customer satisfaction with that transaction). While this is an advantage, these companies still need to ensure their data are organized together in an appropriate way.
With these data sources, we can conduct three general types of linkage analyses:
Financial: linking customer feedback to financial metrics
Operational: linking customer feedback to operational metrics
Constituency: linking customer feedback to employee and partner variables
Before we go further, I need to make an important distinction between two different types of customer feedback sources: 1) relationship-based and 2) transaction-based. In relationship-based feedback, customer ratings (data) reflect their overall experience with and loyalty towards the company. In transaction-based feedback, customer ratings (data) reflect their experience with a specific event or transaction. This distinction is necessary because different types of linkage analyses require different types of customer feedback data (See Figure 3). Relationship-based customer feedback is needed to conduct financial linkage analyses and transaction-based customer feedback is needed to conduct operational linkage analyses.
The term “linkage analysis” is actually a misnomer. Linkage analysis is not really a type of analysis; it is used to denote that two different data sources have been “linked” together. In fact, several types of analyses can be employed after two data sources have been linked together. Three general types of analyses that I use in linkage analyses are:
Factor analysis of the customer survey items: This analysis helps us create indices from the customer surveys; these indices will be used in the subsequent analyses. Because they are made up of several survey questions, indices are more reliable than any single survey question. Therefore, if there is a real relationship between customer attitudes and financial performance, the chances of finding this relationship greatly improve when we use indices rather than single items.
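As a simplified stand-in for that step (not a full factor analysis), the sketch below assumes three hypothetical survey items already group on one factor, averages them into an index, and checks the index's internal consistency with Cronbach's alpha. In a real study, the factor analysis itself would confirm which items belong together; all responses here are invented.

```python
import statistics

# Hypothetical responses to three related loyalty items (1-10 scale),
# one row per respondent; we assume these items group on one factor.
responses = [
    [9, 8, 9],
    [4, 5, 4],
    [7, 7, 8],
    [3, 2, 3],
    [10, 9, 10],
]

k = len(responses[0])  # number of items in the index

# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals),
# a standard internal-consistency check for a multi-item index.
item_vars = [statistics.variance(col) for col in zip(*responses)]
totals = [sum(row) for row in responses]
alpha = (k / (k - 1)) * (1 - sum(item_vars) / statistics.variance(totals))

# Each respondent's index score is the mean of the items.
index_scores = [statistics.mean(row) for row in responses]
print(f"Cronbach's alpha = {alpha:.2f}")
```

A high alpha supports the claim above: the multi-item index is more reliable than any single question it contains.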
Correlational analysis (e.g., Pearson correlations, regression analysis): This class of analyses helps us identify the linear relationship between customer satisfaction/loyalty metrics and other business metrics.
Analysis of Variance (ANOVA): This type of analysis helps us identify the potentially non-linear relationships between the customer satisfaction/loyalty metrics and other business metrics. For example, it is possible that increases in customer satisfaction/loyalty will not translate into improved business metrics until customer satisfaction/loyalty reaches a critical level. When ANOVA is used, the independent variables in the model (x) will be the customer satisfaction/loyalty metrics and the dependent variables will be the financial business metrics (y).
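A minimal sketch of such a one-way ANOVA, computed by hand with the standard library: accounts are grouped into satisfaction bands (the independent variable) and a financial metric is compared across bands (the dependent variable). The bands and revenue figures are invented to show the kind of threshold pattern described above.

```python
import statistics

# Hypothetical annual revenue ($K) for accounts in three
# relationship-satisfaction bands; revenue jumps only in the top band.
groups = {
    "low (1-6)":    [50, 55, 48, 52],
    "medium (7-8)": [54, 58, 53, 57],
    "high (9-10)":  [80, 86, 78, 84],
}

all_values = [v for g in groups.values() for v in g]
grand_mean = statistics.mean(all_values)
k, n = len(groups), len(all_values)

# One-way ANOVA: between-group vs. within-group sums of squares.
ss_between = sum(len(g) * (statistics.mean(g) - grand_mean) ** 2
                 for g in groups.values())
ss_within = sum((v - statistics.mean(g)) ** 2
                for g in groups.values() for v in g)

f_stat = (ss_between / (k - 1)) / (ss_within / (n - k))
print(f"F = {f_stat:.1f}")  # a large F flags mean differences across the bands
```

Here a correlation alone could understate the relationship, because revenue is flat across the lower bands; the ANOVA comparison of group means surfaces the non-linear jump.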
Business linkage analysis is the process of combining different sources of data to uncover important insights about the causes and consequences of customer satisfaction and loyalty. For VOC programs, linkage analyses fall into three general types: financial, operational, and constituency. Each of these types of linkage analysis provides useful insight that can help senior executives better manage customer relationships and improve business growth. I will provide examples of each type of linkage analysis in following posts.
At their previous jobs at venture capital firms Sequoia Capital and Accel Partners, respectively, Neha Singh and Abhishek Goyal often had to help identify prospective startups and make investment decisions.
But it wasn't always easy.
Startups usually don't disclose information about themselves; since they are privately held firms, they are under no compulsion to share data publicly. So Singh and Goyal constantly struggled to collate information from multiple sources.
Eventually, fed up with the lack of a single source for data, the Indian Institute of Technology graduates quit their jobs in 2013 to start an analytics firm, Tracxn!. Their ambition: to become the Gartner (the go-to firm for information technology research) of the startup ecosystem.
"It's almost surprising," Singh told Quartz in an email interview, "that despite billions of dollars invested in each of the sectors (be it foodtech or mobile commerce, or payments, etc.), thousands of people employed in this ecosystem and many more aspiring to start something here, there is not a single source which tracks and provides insights into these private markets."
Tracxn! started operations in May 2013, working from Lightspeed Venture Partners' office in Menlo Park, California, with angel funding from founders of e-commerce companies like Flipkart and Delhivery. In 2014, the startup began its emerging markets operation with a focus on India and China.
"After our first launch in April last year, we scaled the revenues quickly and turned profitable last September, (and) grew to a team of 40," Singh said. Most of its analysts are based in Bengaluru.
Tracxn! follows a SaaS (software as a service) business model, charging subscribers between $20,000 and $90,000 per year. With a database of over 7,000 Indian and 21,000 US startups, Singh and Goyal now count over 50 venture capital funds among their clients, which also include mergers and acquisitions specialists, product managers, founders, and aspiring entrepreneurs.
While firms like Mattermark, Datafox, and CB Insights provide similar services, Tracxn! allows investors to get an overview of a sector within the ecosystem before drilling down to individual companies.
"For many funds, we have become a primary source of their deal discovery," said Singh. "We want to become the default research platform for anyone looking for information and trends on these private markets and companies."
In April this year, Tracxn! received $3.5 million in funding from private equity firm SAIF Partners, which it plans to use to ramp up its analyst strength to 150 by the end of the year.
"We keep getting inquiries from investors across various countries (like from Europe, parts of Southeast Asia, etc.)," explained Singh. "But we cannot launch them because we don't have analyst teams for it."
But with money on its way, Tracxn! now wants to expand coverage into Malaysia, Indonesia, Singapore, the Philippines, Vietnam, and Europe to build its global database.
Originally posted at: http://qz.com/401931/meet-the-startup-that-is-obsessed-with-tracking-every-other-startup-in-the-world/
I recently read a good article on the difference between structured and unstructured data. The author defines structured data as data that can be easily organized; as a result, these types of data are easily analyzable. Unstructured data refers to information that either does not have a pre-defined data model and/or is not organized in a predefined manner. Unstructured data are not easy to analyze. A primary goal of a data scientist is to extract structure from unstructured data. Natural language processing is a process of extracting something useful (e.g., sentiment, topics) from something that is essentially useless in its raw form (e.g., text).
While I like the definitions she offers, she included an infographic that is confusing. It equates the structural nature of the data with the source of the data, suggesting that structured data are generated solely from internal/enterprise systems while unstructured data are generated solely from social media sources. I think it would be useful to separate the format (structured vs. unstructured) of the data from the source (internal vs. external) of the data.
Sources of Data: Internal and External
Generally speaking, business data can come from either internal or external sources. Internal sources of data reflect those data that are under the control of the business. These data are housed in financial reporting systems, operational systems, HR systems, and CRM systems, to name a few. Business leaders have a large say in the quality of internal data; these data are essentially a byproduct of the processes and systems the leaders use to run the business and generate/store the data.
External sources of data, on the other hand, are any data generated outside the walls of the business. These data sources include social media, online communities, open data sources, and more. Due to their nature, external sources of data are under less control by the business than are internal sources of data. These data are collected by other companies, each using its own systems and processes.
Data Definition Framework
This 2×2 data framework is a way to think about your business data (see Figure 1). The model distinguishes the format of data from the source of data. The two columns represent the format of the data, either structured or unstructured. The two rows represent the source of the data, either internal or external. Data can fall into one of the four quadrants.
Using this framework, we see that unstructured data can come from both internal sources (e.g., open-ended survey questions, call center transcripts) and external sources (e.g., Twitter comments, Pinterest images). Unstructured data is primarily human-generated. Human-generated data are those that are input by people.
Structured data also can come from both inside (e.g., survey ratings, Web logs, process control measures) and outside (e.g., GPS for tweets, Yelp ratings) the business. Structured data includes both human-generated and machine-generated data. Machine-generated data are those that are calculated/collected automatically and without human intervention (e.g., metadata).
The quality of any analysis is dependent on the quality of the data. You are more likely to uncover something useful in your analysis if your data are reliable and valid. When measuring customers' attitudes, we can use customer ratings or customer comments as our data source. Customer satisfaction ratings, due to the nature of the data (structured/internal), might be more reliable and valid than customer sentiment metrics from social media content (unstructured/external); as a result, the use of structured data might lead to a better understanding of your data.
Data format is not the same as data source. I offer this data framework as a way for businesses to organize and understand their data assets. Identify strengths and gaps in your own data collection efforts. Organize your data to help you assess your Big Data analytic needs. Understanding the data you have is a good first step in knowing what you can do with it.
Fear of the unknown grips everyone when adopting anything new, so it is natural that there are more skeptics when it comes to Cloud computing, a new technology that not everybody understands. The lack of understanding creates fear that makes people worry without reason before they take the first step in adopting the latest technology. This pattern has been evident during the introduction and launch of every new technology, and the advent of the Cloud is no exception. It is therefore not a surprise that when it comes to Cloud computing, the likely stakeholders, comprising IT professionals and business owners, are wary about the technology and often suspicious about its security level.
Despite wide-scale adoption (more than 90% of enterprises in the United States use the Cloud), there are mixed feelings about Cloud security among companies. Interestingly, it is not enterprises alone that use Cloud services; the platform attracts a large section of small and medium businesses too, with 52% of SMBs utilizing it for storage. The numbers indicate that users have been able to overcome the initial fear and are now trying to figure out what the new technology is. There is a feeling that Cloud security is inferior to the security offered by legacy systems, and in this article, we will try to understand why the Cloud is so useful and why there should not be concerns about its security.
The perception of Cloud security
The debate rages around whether the Cloud is much more secure or only somewhat more secure than legacy systems. A survey revealed that 34% of IT professionals feel the Cloud is slightly more secure, but not secure enough to give them the confidence to rank it a few notches above legacy systems. The opinion stems from the fact that there have been some high-profile data breaches in the Cloud at Apple iCloud, Home Depot, and Target, but the breaches resulted not from shortcomings of Cloud security but from human factors. Misinformation and lack of knowledge are the reasons people remain skeptical about Cloud security.
Strong physical barriers and close surveillance
There used to be a time when legacy system security was not an issue, because denying access to on-premise computers was good enough to thwart hackers and other intrusions. However, it can be difficult to implement proper security in legacy systems comprising workstations, terminals, and browsers, which makes them unreliable. Businesses are now combining legacy systems with Cloud infrastructure, together with backup and recovery services, which exposes the legacy side to security threats from hackers. Moreover, assessing the security of legacy systems entails a multi-step process that is not easy, which tends to indicate that replacing the legacy system is a better option.
While a locked door is the only defense in most offices to protect the computer system, Cloud service providers have robust arrangements for the physical security of data centers, comprising barbed wire, high fences, concrete barriers, security cameras, and guards patrolling the area. Besides preventing people from entering the data center, these measures also cover activities in the adjoining spaces.
Access is controlled
The threat comes not only from online attackers who try to breach the system but also from people gaining easy physical access to it. Cloud service providers ensure complete data security through data encryption during storage. Organizations are now turning to selective data storage, using the Cloud facility to store sensitive data offsite and keep it inaccessible to unauthorized persons. This reduces the human risk of causing damage, since only authorized users get access to sensitive data that remains securely stored in the Cloud. No employee, vendor, or third party can access the data by breaching the security cordon.
Cloud service providers are well aware of the security concerns and adopt robust security measures to ensure that once data reaches the data centers, it remains wholly protected. The Cloud is under close monitoring and surveillance round the clock, which gives users more confidence about data security. When using Cloud services, you not only get access to a top-class data center that offers flexibility and security but also receive the support of qualified experts who help you make better use of the resource for your business.
Auditing the security system
To ensure flawless security for their clients, Cloud service providers conduct frequent audits of their security features to identify possible weaknesses and take measures to eradicate them. Although a yearly audit is the norm, interim audits may also take place if the need arises.
As the number of Cloud service users keeps increasing, the security fears are steadily being quelled.
Master Data Management (MDM) is the process of establishing and implementing standards, policies and tools for data that’s most important to an enterprise, including but not limited to information on customers, employees, products and suppliers.
In business, master data management (MDM) comprises the processes, governance, policies, standards, and tools that consistently define and manage the critical data of an organization to provide a single point of reference.
The data that is mastered may include:
Master data – the business objects for transactions, and the dimensions for analysis
Reference data – the set of permissible values to be used by other data fields
Transactional data – the data that supports applications
Analytical data – the data that supports decision making
An MDM tool can be used to support master data management by removing duplicates, standardizing data (mass maintaining), and incorporating rules to prevent incorrect data from entering the system, in order to create an authoritative source of master data. Master data are the products, accounts, and parties for which business transactions are completed. The root-cause problem stems from business unit and product line segmentation, in which the same customer will be serviced by different product lines, with redundant data being entered about the customer (aka the party in the role of customer) and the account in order to process the transaction. The redundancy of party and account data is compounded in the front-to-back-office life cycle, where an authoritative single source for party, account, and product data is needed but is often once again redundantly entered or augmented.
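As a toy illustration of two of those MDM tool functions (standardization and deduplication), here is a minimal sketch; the field names, match rule (email as the match key), and records are all invented for the example, and a real MDM tool would use far richer matching logic.

```python
def standardize(record):
    """Normalize casing and whitespace so equivalent values match."""
    return {
        "name": " ".join(record["name"].split()).title(),
        "email": record["email"].strip().lower(),
    }

def deduplicate(records):
    """Keep the first record seen per email, yielding one golden record per party."""
    golden = {}
    for rec in map(standardize, records):
        golden.setdefault(rec["email"], rec)
    return list(golden.values())

# The same customer entered twice by different product lines,
# plus one distinct customer.
raw = [
    {"name": "jane  DOE", "email": "Jane.Doe@example.com "},
    {"name": "Jane Doe",  "email": "jane.doe@example.com"},
    {"name": "sam smith", "email": "sam@example.com"},
]
print(deduplicate(raw))  # two golden records remain
```

Note that standardization has to run before matching: without it, the two Jane Doe records would not collide on the email key and the duplicate would survive.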
So, with a task this important, master data must be designed appropriately and after careful consideration of the various bells and whistles that are responsible for the success or failure of the project. Following are the top 21 best practices that need to be considered before applying a good data management strategy.
1. Define “What is the business problem we’re trying to solve?”:
With so much data and so many disparate data sources, it is very easy to get lost in translation. A mental road map of the overall objective will help keep the effort streamlined.
2. Understand how the project helps to prep you for big data:
Yes, growing data is a concern, and it should be sorted out at the planning stage. It is important to identify how the master data management strategy will prepare your organization not only for generic enterprise data but also to cope with ever-increasing big data.
3. Devise a good IT strategy:
A good IT strategy always goes hand in hand with a good data strategy. A dysfunctional IT strategy could really throw off even the most efficiently designed data management strategy. A good IT strategy increases the chances of success for a good MDM strategy by several degrees.
4. Business “users” must take full ownership of the master data initiative:
It is important that the business and its users take full ownership of the initiative. Well-defined ownership saves the project from the communication failures that are behind so many project failures.
5. Allow ample time for evaluation and planning:
A well-laid-out planning stage ensures all the cracks and crevices are sorted out before the project is rolled out; a rushed project increases the risk of failure. Don't underestimate the time and expertise needed to develop foundational data models.
6. Understand your MDM hub's data model and how it integrates with your internal source systems and external content providers:
When data model problems cropped up relatively late in the project, whether it was a disconnect between the hub and an important source system, or a misalignment between data modeled in the hub and an external information provider, it was very disruptive. These problems can be avoided by really understanding how the hub is designed, and then mapping that back to your source systems and your external information sources.
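One cheap way to catch such disconnects early is to diff the hub's data model against each source system's fields before loading anything. The schemas and mapping below are invented for illustration; real ones come from your MDM tool and source-system documentation:

```python
# Hypothetical hub attributes and one source system's fields (illustrative names).
hub_model = {"party_id", "legal_name", "tax_id", "country_code"}
source_fields = {"cust_no", "legal_name", "country_code", "fax"}

# Mapping from source field -> hub attribute, built during analysis.
mapping = {"cust_no": "party_id", "legal_name": "legal_name", "country_code": "country_code"}

# Source data the hub will silently drop, and hub attributes no source populates:
unmapped_source = source_fields - set(mapping)
unfilled_hub = hub_model - set(mapping.values())

print("source fields with no hub target:", sorted(unmapped_source))
print("hub attributes with no source:", sorted(unfilled_hub))
```

Running this kind of check per source system surfaces the "disconnect between the hub and an important source system" while it is still a planning problem rather than a production one.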
7. Identify the project’s mission and business values:
This is another area that needs its due attention. A clear definition of the project's mission and business value helps ensure that high ROI is planned for in the process. One must link the initiative to actionable insights.
8. Choose the best technology platform:
Choosing good technology is important as well. Remember, you don't change your technology daily, so putting some thought and research into it makes a lot of difference to the sustainability of the project. A good technology platform should support the organization's growth for the next several years without presenting too many bottlenecks.
9. Be real and plan a multi-domain design:
In the real world, many MDM technologies grew up managing one particular type of master data. A good strategy must be consistent across domains, so applying the same approach to the various master data domains, whether customer, product, asset, supplier, location or person, is the way to go.
10. Active, involved executive sponsorship:
Most organizations are very comfortable with their "islands of data" and with technology being implemented in silos. For someone in the organization to come along and suggest changing that status quo, and to start managing critical information centrally, treating it as a true corporate asset, is going to mean some serious cultural change.
11. Use a holistic approach - people, process, technology and information:
This may be the most important best practice. You've got to start with the people, the politics, the culture, and then make sure you spend at least as much time on the business processes involved in data governance and data stewardship. These really deserve a separate article of their own.
12. Pay attention to organizational governance:
You must have a very strong governance model that addresses issues such as change management and knowledge transfer. After all, organizational culture is a powerful force, and a deliberate plan to de-risk the project against cultural resistance helps ensure success.
13. Build your processes to be ongoing and repeatable, supporting continuous improvement:
Data governance is a long-term proposition. As long as the enterprise is in business, it will be creating, modifying, and using master data. If everyone in the company relies on master data but no one is specifically accountable for maintaining and certifying its quality, it should be no surprise that, over time, it becomes more and more chaotic and unusable. So plan from the beginning for a "way of life", not a project.
14. Have a big vision, but take small steps:
Consider the ultimate goal, but limit the scope of the initial deployment, users told Ventana. Once master data management is working in one place, extend it step by step, they advised. Business processes, rather than technology, are often the limiting factor, they said, so it's important to get end-user input early in the process.
15. Consider potential performance problems:
Performance is the 800-pound gorilla quietly lurking in the master data management discussion, Loshin cautioned. Different architectures can carry different performance penalties, so leave room in the plan to address them.
16. Management needs to recognize the importance of a dedicated team of data stewards:
Just as books belong in a library and a library needs librarians, master data belongs in a dedicated repository of some type, and that repository needs to be managed by data stewards. It is crucial to start by convincing management of the need for a small team of data stewards who are 100% dedicated to managing the enterprise's master data.
17. Consider the transition plan:
Then, there’s the prospect of rolling out a program that has an impact on many critical processes and systems — no trivial concern. Loshin recommended that companies should plan an master data management transition strategy that allows for static and dynamic data synchronization.
18. Resist the urge to customize:
Now that commercial off-the-shelf hub platforms have matured a bit, it should be easier to resist the temptation to get under the hood and customize them. Most vendors are still revving their products as often as twice a year, so you definitely don't want to get into a situation where you are "rev locked" to an older version.
19. Stay current with vendor-provided patches:
Given the frequency of point releases, patches and major upgrades, you should probably plan for at least one major upgrade during the initial implementation, and be sure to build "upgrade competency" in the team that will maintain the hub platform after the initial project goes live.
20. Carefully plan deployment:
With increasing MDM complexity, training of business and technical people is more important than ever. Untrained or semi-trained systems integrators and rushed outsourcing attempts have caused major problems and project delays for master data management users.
21. Test, test, test and then test again:
This is like the old saying about what's important in real estate: "location, location, location". Your MDM hub environment is going to be, by definition, different from every other environment in the world.
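Because every hub environment is unique, the matching and standardization rules deserve their own tests against the data formats your environment actually produces. The phone-matching rule below is a self-contained, illustrative example of the idea, not any vendor's rule syntax:

```python
def normalize_phone(raw):
    """Strip everything but digits so formatting differences don't break matching."""
    return "".join(ch for ch in raw if ch.isdigit())

def phones_match(a, b):
    """An illustrative match rule: two phones match if their digits agree."""
    return normalize_phone(a) == normalize_phone(b)

# Exercise the rule with the real-world formats your sources emit.
assert phones_match("(309) 555-0100", "309.555.0100")
assert not phones_match("309-555-0100", "309-555-0101")
print("match rule tests passed")
```

Keeping a suite like this around means each vendor point release (see practice 19) can be re-verified against your own data in minutes.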
Before we dive into the topic, I want to take a step back and explain what a QR code is: QR Code (abbreviated from Quick Response Code) is the trademark for a type of matrix barcode (or two-dimensional code) first designed for the automotive industry. More recently, the system has become popular outside the industry due to its fast readability and large storage capacity compared to standard UPC barcodes. The code consists of black modules (square dots) arranged in a square pattern on a white background. The information encoded can be made up of four standardized kinds ("modes") of data (numeric, alphanumeric, byte/binary, Kanji), or, through supported extensions, virtually any kind of data (per Wikipedia).
To me, the QR code is an amazing magic wand with the power to connect the analog world to the digital world. It can engage a motivated customer, the one actually scanning the code, and convert them into a loyalist. From the day I was introduced to QR codes until today, I have been extremely excited about what they are worth, but at the same time disappointed by how underutilized they are. For the sake of this blog, and to understand what nearby stores are doing with their QR codes, I visited my nearest mall and photographed the first few QR executions I found. To my surprise, it did not take much to find and snap a few different types of QR implementations. The amazing thing is that they are all doing it wrong; I will get to that soon. QR codes face some challenges with adoption, but with capable mobile devices, usage is bound to pick up if it has not already. With adoption this slow, the last thing retailers need is a lousy execution driving users away from this amazing digital wonder.
So, what are retailers doing wrong?
The QR codes deployed cover use cases ranging from "sign up for our mailing list" and "download our app" to "visit our social page". There was no consistency in execution; every store wants users to juggle in a different way. Below are 5 use cases that I came across, and it is very likely that most retail QR code implementations fall into one of them. I can understand that retailers are still experimenting with QR code projects and gauging the impact. But consider this: a user who is motivated to scan a QR code puts in considerable effort and clicks to get to the other side. So what is it all worth: a Facebook like, a Twitter follow, an app download, or a mailing-list signup? Having a QR code should be treated like having a domain. Try having a domain name point to all of those services at once. Just like domain names, QR codes are precious; they are a perfect way to engage an already committed user. So why throw a vague call to action at them? Why not grab their attention with something that is a win-win for both the retailer and the customer?
Following are implementations of retail QR codes: "The Good, The Bad and The Ugly sides".
1. I got this image from someone and found it very interesting to share. It has some pros and cons to it.
The Good: The QR code sits in a prime location; the gate is the first interface, and attaching the QR code there makes it easily accessible. So, kudos for that.
The Bad: Is a Facebook like or a Twitter follow really that important? If a user jumps through hoops to scan a QR code just to like a Facebook page, does that reward their effort? Is it providing enough value to the retailer or the user?
The Ugly: See at the end.
2. This is another example, from a smoothie joint in the mall area close to my place.
The Good: It is great to separate interest groups; people with different intents will pick the appropriate call to action. Here the Yelp and Facebook audiences are given separate QR codes.
The Bad: Confusion. With limited adoption and involvement, it is far too risky to have two QR codes. It also exposes the campaign to technical issues: what if the user scans from a distance and both codes are captured? This implementation raises more questions than answers.
The Ugly: See at the end.
3. This one was taken at a Van Heusen outlet store in the nearby mall.
The Good: $5 is a generous reward for the effort the user is going through; it gratifies users for their trouble. The $5 also works magic when it comes to fixing eyes on the banner.
The Bad: A confusing plate. Two offers are bundled into one sign, which could confuse users, and text and QR code are packaged into one.
The Ugly: See at the end.
4. This image was taken at a local GMC store. I like the way they explained the use case, and I find no problem in understanding how to use it. But then, I am not sure whether it is usable for all audiences.
The Good: A very well-laid-out plan for how to engage with the use case.
The Bad: Only caters to deal hunters; what if you are not here for deals?
The Ugly: See at the end.
5. This image was taken at a nearby Costco. I go there often and had never paid attention to this banner until recently.
The Good: The position of the banner; it sits right above the checkout counter. If a long queue awaits, users can spend the waiting time engaging with the banner.
The Bad: A wordy banner with fine print, asking the user to download an app. Apps are intimate to users because of the limited real estate on a mobile phone, so this asks for too much commitment from someone who is just waiting in line.
The Ugly: Almost all the use cases suffer from the same issue: they do not create a bi-directional engagement interface with the user. A QR code is scanned while the user is physically present in the store, and at that moment you do not yet know what problem the user is trying to solve, so selling them something without knowing what they want to buy is not a great idea. It is therefore important for retail stores to provide a dashboard that addresses the user's current need (tools to help a browsing shopper) and, once that need is addressed, offers an opportunity to convert those users with app download links, social follow buttons or email newsletter signups.
From the observations above, a few things stand out. Retailers get the importance of QR codes but still lack a use case that helps customers engage better with their brand. As with all great brands, listening is as important a task as talking, so why should QR codes be any different? They should give the brand the ability to listen to the customer as well as talk to them. A QR code should therefore be thought of primarily as a tool to engage an active customer.
It is important to look at QR codes through a different lens. Unlike a Facebook like, a mobile app or a social follow, a QR code is used while the user is actively engaged in a store, so selling them engagement tools for the future is not especially targeted. One could obviously cross-sell those tools on the landing page once the user scans, but QR code interfaces should be handled differently and not mixed up with loyalty tools.
So, an ideal QR code implementation should have the following components:
1. A single QR code addressing all the needs of the user.
2. Well-accessible placement of the QR code, making it easily discoverable.
3. A well-laid-out procedure to help users engage with the QR code.
4. A QR code that brings users to a super dashboard that helps them in every possible way, e.g., product descriptions, deals, specials, live chat, app links.
5. The capability for users to leave comments, complaints and suggestions and to fill in surveys.
6. The ability to extend the engagement by providing links to social media channels, apps, email lists, newsletters, etc.
7. Access to an email list.
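The single-code, super-dashboard idea above can be sketched as one landing URL whose server routes the scan to the right section. The domain, query parameters, and section names below are invented for illustration; any web framework would do the same job:

```python
from urllib.parse import urlparse, parse_qs

# Sections the dashboard offers once the shopper scans (illustrative names).
DASHBOARD_SECTIONS = {
    "deals":    "Today's deals and specials",
    "products": "Product descriptions and stock",
    "chat":     "Live chat with a store associate",
    "feedback": "Leave a comment, complaint or survey",
    "connect":  "App links, social pages, newsletter signup",
}

def route_scan(qr_url):
    """Decode the single QR landing URL and pick a dashboard section.

    Falls back to the full menu so one code serves every intent."""
    query = parse_qs(urlparse(qr_url).query)
    section = query.get("section", ["menu"])[0]
    if section in DASHBOARD_SECTIONS:
        return DASHBOARD_SECTIONS[section]
    return "Dashboard menu: " + ", ".join(sorted(DASHBOARD_SECTIONS))

print(route_scan("https://example-store.com/qr?store=42&section=deals"))
print(route_scan("https://example-store.com/qr?store=42"))
```

Because the code itself only encodes one URL, the retailer can change or add sections server-side without reprinting a single banner, which is exactly why a single QR code is as precious as a domain name.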
Depending on the business, the users and the use case, there may be more or fewer components than stated above, but the overall coverage should be much the same.
So, some advice to all retailers: get back to the whiteboard and rethink your existing QR code strategy. It is a big pot of gold if done right. With the holiday season approaching, this could be a great opportunity to connect with the masses and engage them by designing the perfect system.
Data scientists are not a dime a dozen, nor are they in abundance. The buzz around big data has produced a job category that is not only confusing but has been costing companies dearly as they comb the talent pool for a so-called data scientist. So what exactly is the problem, and why are we suddenly seeing lots of data scientists emerging from nowhere with very different skill sets? To understand this, we need to understand the big data phenomenon.
With the emergence of big data user companies like Google, Facebook and Yahoo, and their amazing contributions to open source, new platforms have been developed to process huge amounts of data on commodity hardware in fast yet cost-efficient ways. Now every company wants to get savvier about managing data to gain insights and ultimately build a competitive edge over rivals. But companies are used to understanding small pieces of data through their business analysts; confront them with more data and more tools, and who fits in? So they went on the lookout for a special breed of professional with the capability to deal with big data and its hidden insights.
So where is the problem? It lies in the fact that only one job title emerged from this phenomenon: data scientist. Professionals already practicing some data science via business analysis, data warehousing or data design jumped on the bandwagon and grabbed the title. What is interesting is that the work described above does not fit a single job description, and it should not be handled as if it did. It was never a magical job title holding all the answers for any data-curious organization wanting to understand, develop and manage a data project.
Before we go into what companies should do, let's reiterate what a data scientist is. As the name suggests, it has something to do with data and with science, which means the job description should include data engineering, data automation and scientific computing, with a hint of business capability. If we extrapolate, we are looking at a professional with a computer science degree, a doctorate in statistical computing and an MBA, who by the way should also have some industry domain expertise. What is the likelihood that such a talent exists? Rare. But even if such people were in abundance, companies should tackle this problem at a more granular and sustainable scale. One more thing to note: no two data scientist job requirements are the same, which means your data scientist requirement could be extremely different from what anyone else is looking for. So why should one title cater to such a diverse category?
So what should companies do? First, it is important to understand that companies are building data science capabilities, not hiring a herd of data scientists. Hiring managers should understand that they are looking not for a particular individual but for a team as a solution. It is important for businesses to clearly articulate the magic skill sets their so-called data scientist should carry, then split those skills into categories: data analytics, business analysis, data warehousing, software development and data engineering, to name a few. Finding a common island where business analysts, statistical modelers and data engineers work in harmony on a system that handles big data is a great start. Think of it as putting together a central data office. Huh, another buzzword! Don't worry; I will go into more detail in follow-up blogs. Think of it as a department where business, engineering and statisticians work together on a common objective. Data science is nothing but the art of finding value in lots of data, and big data is about building the capability to parse and analyze lots of it. So businesses should work through their laundry list of skills, first identifying internal resources that could cover parts of the list, and then form a hard matrix structure so that a set of people works together as the data scientist. By the way, I am not saying you need one individual from each category; together, the team should have all the skills mentioned above.
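The "team as a data scientist" idea can be made concrete with a simple coverage check against the articulated skill list. The skill names and team composition below are illustrative assumptions, not a prescription:

```python
# Skills the business says its "data scientist" must carry (illustrative list).
REQUIRED_SKILLS = {"statistics", "data engineering", "business analysis",
                   "data warehousing", "software development"}

# Candidate team members and what each actually brings (hypothetical roles).
team = {
    "analyst":   {"business analysis", "statistics"},
    "engineer":  {"data engineering", "software development"},
    "dw_expert": {"data warehousing"},
}

def coverage_gaps(required, members):
    """Return the required skills that no team member covers."""
    covered = set().union(*members.values())
    return required - covered

gaps = coverage_gaps(REQUIRED_SKILLS, team)
print("missing skills:", sorted(gaps) or "none - the team jointly covers the role")
```

No single member here is "the data scientist", yet the union of the three covers the whole laundry list, which is exactly the hard-matrix point made above.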
One important takeaway for companies: the moment they come across a so-called data scientist, it is important to understand which side of data science that talent represents. Placing the talent in its respective silo provides a clearer vision of the talent itself and of the void that remains if the other roles are not filled. Living in this convoluted world of the data scientist is hard and tricky, but with some chops in understanding data science as a talent category, companies can play the big data talent game to their advantage, lure some cutting-edge people and grow sustainably.
When you encounter apparently unmanageable and insurmountable Big Data sets, easy access to the correct data can seem practically impossible. The fact is, Big Data management can prove tricky and challenging and may raise a few issues, yet effective data access can still be attained. Here are a few strategies for achieving superlative data connectivity.
Hadoop is an ecosystem designed to help organizations store mammoth quantities of Big Data. As companies face the challenge of integrating Big Data with their existing data infrastructure, it is important to have a sound understanding of how to bring your Big Data into Hadoop and take it out again, so that you can move ahead effectively.
Integrating Cloud Data with Already Existing Reporting Applications
Integrating Cloud Data with existing reporting applications such as Salesforce Dx has totally transformed the way you perceive and work with your customer data. These systems, however, can face complications in producing real-time reports. You rely on these reports for sound business decision-making, which creates demand for an effective solution that enables such real-time reporting.
Do Not Let the Sheer Scale of Big Data Get to You
Big Data can be hugely advantageous for businesses, but if your organization is not ready to handle it effectively, you may have to do without the business value Big Data has on offer. Only some organizations have the scalable, flexible data infrastructure required to exploit Big Data for crucial business insight.
Access Salesforce Data via SQL
Salesforce data provides great value for numerous organizations; however, access issues can be a major obstacle preventing them from reaping the fullest possible advantage. Businesses can now gain easy access to Salesforce data through ODBC and SQL: smart drivers let you create a connection and start running your queries in just a few minutes.
Do Accurate Analysis of Big Data
The accuracy of Big Data analysis depends on the technology used. A company can choose from several Big Data platforms, such as Apache Spark and Hadoop, that deliver distinctive, accurate analysis of Big Data sets, and more cutting-edge technology generates more state-of-the-art Big Data models. Many organizations opt for a reliable Big Data provider; with the great variety of options open to them, businesses today can easily find a provider that suits their specific requirements and produces accurate, precise outcomes.
Business organizations must take extra initiative in assessing and analyzing the data they collect. They must make sure the data comes from an authentic, reliable source and identify the context behind its generation. Every step of the analysis process must be observed carefully, from proper data ingestion through enrichment and preparation. Protecting the data from external interference is essential.
Sujain Thomas is a Salesforce consultant and discusses the benefits of Salesforce Dx in her blog posts. She is an avid blogger and has an impressive fan base.