Six Do’s and Don’ts of Collaborative Data Management

Data Quality projects are no longer purely technical projects. They are becoming collaborative and team-driven.

As organizations strive to succeed in their digital transformations, data professionals realize they need to work as a team with business operations, because those are the people who need better data to run their operations well. Being in the cockpit, Chief Data Officers need to master some simple but useful do’s and don’ts for running their Data Quality projects.

Let’s list a few of these.

DO’S

Set your expectations from the start.

Why Data Quality? What are you targeting? How deeply will it impact your organization’s business performance? Find your Data Quality answers among business people. Make sure you know your finish line, so you can set intermediate goals and milestones on a project calendar.

Build your interdisciplinary team.

Of course, it’s about having the right technical people on board: people who master data management platforms. But it’s also about finding the right people who understand how Data Quality impacts the business, and making them your local champions in their respective departments. For example, digital marketing experts often struggle with bad leads and low-performing tactics due to a lack of good contact information. Moreover, new regulations such as GDPR have made marketing professionals aware of how important personal data is. By putting tools such as Data Preparation in their hands, you will give them a way to act on their data without losing control. They will be your allies in your Data Quality journey.

Deliver quick wins.

While it’s key to stretch people’s capabilities and set ambitious objectives, it’s also necessary to prove very quickly that your data quality project has positive business value. Don’t spend too much time on heavy planning. You need to prove business impact with immediate results. Some Talend customers achieved business results very quickly by enabling business people with apps such as Data Prep or Data Stewardship. If you deliver better and faster time to insight, you will gain instant credibility and people will support your project. After gaining credibility and confidence, it will be easier to ask for additional resources when presenting your projects to the board. In the end, remember that many small wins make a big one.

DON’TS

Don’t underestimate the power of bad communication.

We often think technical projects need technical answers, but Data Quality is a strategic topic, and it would be misleading to treat it as a purely technical challenge. To succeed, your project must be widely known within your organization. Take control of your own project story instead of letting bad communication spread across departments. That requires the right mix of know-how and communication skills, so that your results are known and properly communicated within your organization. Marketing suffers from bad leads, operations from missing information, strategists from biased insights. People may ask you to extend your project and solve their data quality issues, which is a good reason to ask for more budget.

Don’t overengineer your projects, making them too complex and sophisticated.

Talend provides a simple and powerful platform to produce fast results, so you can start small and deliver big. One example of implementing data management from the start is Carhartt, which managed to clean 50,000 records in one day. You don’t necessarily need to wait a long time to see results.
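
For a sense of what such a quick cleansing win looks like in practice, here is a minimal sketch in pandas. It is purely illustrative: tools like Talend Data Prep do this through a visual interface, and the file and column names here are hypothetical.

```python
# Illustrative only: a quick contact-list cleansing pass in pandas.
# The input file and column names are hypothetical.
import pandas as pd

records = pd.read_csv("contacts.csv")

# Standardize casing and whitespace in the name and email fields.
records["name"] = records["name"].str.strip().str.title()
records["email"] = records["email"].str.strip().str.lower()

# Drop rows with no usable contact information.
records = records.dropna(subset=["email"])

# Remove exact duplicates on the email key.
records = records.drop_duplicates(subset=["email"], keep="first")

records.to_csv("contacts_clean.csv", index=False)
```

Even a pass this simple, run over a real contact list, is the kind of visible, same-day result that buys a project credibility.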

Don’t leave the clock running or leave your team without clear direction.

Set and meet deadlines as often as possible; it will bolster your credibility. Time runs fast and your organization may shift to short-term business priorities, so track your route and stay focused on your end goals. Make sure you deliver projects on time. Then celebrate success: when you finish a project milestone, take time to celebrate with your team and within the organization.

To learn more about Data Quality, please download our Definitive Guide to Data Quality.


August 14, 2017 Health and Biotech Analytics News Roundup

Biohackers Encoded Malware in a Strand of DNA: University of Washington researchers encoded a sequence of DNA to cause a buffer overflow in the program used to compress it, potentially allowing an attacker to take control. Note, however, that the attack was not always successful, and that the program had been modified to allow the attack to take place.

Geneformics Announces the First Truly Scalable Genomics Data Compression Solution to Accelerate the Migration of Precision Medicine to the Cloud: The large size of genomics data limits the ability to store and analyze it, which this solution helps to address.

The 5 Smartest Companies Analyzing Your DNA: Overviews of 23andMe, Illumina, Oxford Nanopore, Sophia Genetics, and Veritas Genetics.

New tumor database deployed to battle childhood cancer at UC Santa Cruz: The database is public and free to use at https://treehouse.xenahubs.net.

Deep Learning Thrives in Cancer Moonshot: The CANcer Distributed Learning Environment (CANDLE) project is looking to develop new models and frameworks that will help create new cancer therapies.


Building Data Dashboards for Business Professionals


Everyone wants to get more out of their data, but how exactly to do that can leave you scratching your head. Our BI Best Practices demystify the analytics world and empower you with actionable how-to guidance.

Preserving insights

One of the biggest pitfalls in data work is preserving insights when analysis is handed off from the data team to a business professional. Often, the data experts have been exploring the data for a while and have developed a clear sense of its structure, assumptions, and conclusions. The data analyst has had a great opportunity to pinpoint an insight, but when it comes to sharing that work with the business person who will ultimately make a decision, the insight often isn’t fully communicated. Here are some tips to help you improve your data visualization so that you can add the most value to your teams via a dashboard.

The first step to making the most out of data collaborations is to set up a meeting where you discuss the relevant business questions. I outlined some important guidelines for this meeting in 6 Tips for Data Professionals to Improve Collaboration, so if you haven’t already read that, it’s a good place to start. Data experts should expect to come out of that meeting with a list of questions that they can translate into queries to build the initial dashboard. 

In this post, we’ll cover the next steps in the dashboarding process.  


Work from a blueprint

The first step in building a great dashboard is to review that list of questions and group them into larger buckets. When you read the entire list, think about which themes emerge. When you’re identifying those buckets, you want to look for general topics that are only answered by combining a few of the individual questions. In the ideal scenario, all of your individual questions can be grouped into a handful of broader themes.

Once the questions are grouped into buckets, it’s time to build a blueprint of the dashboard, using those general themes as the headers and the individual questions as the charts that fit underneath. My personal blueprinting process uses a lot of sticky notes: I’ll write down each of the questions on my list, then arrange them into groups on paper until I’m happy with the design. It also helps me to make sketches of the individual charts to get the most complete idea of the final dashboard.
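
If it helps to see the blueprint as something concrete, here is a minimal sketch of one as plain data, with hypothetical bucket and question names; each theme maps to the questions whose charts will sit under its header.

```python
# A dashboard blueprint as plain data: theme -> the questions whose
# charts sit under that header. Buckets and questions are hypothetical.
blueprint = {
    "Revenue": [
        "What is monthly recurring revenue over the last 12 months?",
        "How does revenue split across product lines?",
    ],
    "Acquisition": [
        "How many new customers signed up each week?",
        "Which channels drive the most signups?",
    ],
    "Retention": [
        "What is the churn rate by customer segment?",
    ],
}

# Print the blueprint as a rough outline of the future dashboard.
for bucket, questions in blueprint.items():
    print(bucket)
    for question in questions:
        print(f"  - chart: {question}")
```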

At this stage, I’m making decisions about what is included in each chart. For example, if there are three line charts next to each other that can be combined into one chart with three lines, this is the step where I decide to do that. If I combine charts, I need to make sure the title and scale of the combined chart fit all the information appropriately. If I keep three separate line charts, I think about how to format them so that each shows its unique information clearly.
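
As a rough illustration of that combining decision, here is a minimal matplotlib sketch that merges three would-be line charts into a single chart with three lines, one title, and one shared scale; the series names and data are made up.

```python
# Combine three separate line charts into one chart with three lines.
# The series names and values are made up for illustration.
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
series = {
    "Product A": [10, 12, 15, 14, 18, 21],
    "Product B": [8, 9, 9, 11, 12, 13],
    "Product C": [5, 7, 6, 8, 9, 12],
}

fig, ax = plt.subplots()
for name, values in series.items():
    ax.plot(months, values, label=name)

# One title and one scale now have to describe all three series.
ax.set_title("Monthly units sold by product")
ax.set_ylabel("Units sold")
ax.legend()
plt.show()
```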

Getting your charts in order

Once all the organizational work is done, I start arranging the charts to match my blueprint. Every question from that original list gets turned into a chart that’s placed on my dashboard according to the blueprint I made in the first part of the process. 

I like to build a dashboard in horizontal layers, with the very top layer being the most important high-level KPIs and then each of the layers below tackling one of the buckets I identified in blueprinting. To help guide users to understand the purpose and context for each section of the dashboard, I often use text as signposts. Additionally, within each chart, I use titles, colors, and other visual cues to help the charts explain themselves. Finally, when deciding on the arrangement of the charts within each section, I start with the chart that will be most frequently referenced on the left and then work my way to the right in order of decreasing frequency of views.

For the additional layers, I like to provide context for those high-level charts. For example, if the most critical chart in the dashboard is a revenue tracker, the layer directly underneath has more charts that answer questions related to revenue. The next layer down contains more detailed information about the second high-level chart, and so on. This design strategy lets the first row of the dashboard act not only as a quick summary of the most important information but also as a table of contents for the rest of the layers.

In my ideal dashboard, the top layer has one high-level KPI chart to summarize each of the buckets I identified in my initial conversation with the business professional who requested this data. Then, each section underneath that contains answers to all of the related questions we listed that belong to that topic, starting with the information that will need to be referenced the most frequently and working down to the information that will be referenced the least. Data is messy and it doesn’t always fit neatly into those layers, but this mindset makes it easy to compartmentalize and organize a long list of charts.
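
Here is a minimal sketch of that layering logic, with made-up chart names and view frequencies: the top row takes one high-level chart per bucket, and each bucket’s own layer is sorted left to right by how often its charts are expected to be viewed.

```python
# Arrange charts into layers: a top row of one KPI chart per bucket,
# then one layer per bucket sorted by expected view frequency.
# All chart names and frequencies are hypothetical.
charts = [
    {"bucket": "Revenue", "name": "MRR trend", "views_per_week": 50},
    {"bucket": "Revenue", "name": "Revenue by product", "views_per_week": 20},
    {"bucket": "Acquisition", "name": "Weekly signups", "views_per_week": 35},
    {"bucket": "Acquisition", "name": "Signups by channel", "views_per_week": 10},
]
buckets = ["Revenue", "Acquisition"]

def in_bucket(bucket):
    return [c for c in charts if c["bucket"] == bucket]

# Layer 0: the most-viewed chart per bucket doubles as a table of contents.
top_row = [max(in_bucket(b), key=lambda c: c["views_per_week"]) for b in buckets]

# One detail layer per bucket, most frequently referenced chart on the left.
layers = [sorted(in_bucket(b), key=lambda c: -c["views_per_week"]) for b in buckets]

print("top row:", [c["name"] for c in top_row])
for bucket, layer in zip(buckets, layers):
    print(f"{bucket} layer:", [c["name"] for c in layer])
```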

Readability means insight preservation

The goal of your dashboard isn’t to allow business professionals to easily find answers, it’s to help them find the right answers easily.

An important thing to remember when creating a dashboard is that most of your consumers are busy professionals with their own long list of work priorities and deliverables. When they read your dashboards, they are most likely looking for quick answers to a particular problem. They need to take away the important learnings that you’ve found in the data, but they won’t have the same amount of time to spend studying the data as you did. This attention gap is a place where the insight can erode significantly, so you need to make sure that it’s easy to get the right insights quickly.

One of the best ways to focus a reader’s eye is through the use of color. Using the same color for all the charts related to one topic of your dashboard is a shortcut to making sure all of that information is digested together.
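
As a small illustration of that shortcut, you can fix one color per topic up front and reuse it for every chart in that topic’s section; the topics and hex values below are hypothetical.

```python
# Fix one color per dashboard topic so every chart in a section shares it.
# The topics and hex values are hypothetical.
topics = ["Revenue", "Acquisition", "Retention"]
palette = ["#1f77b4", "#ff7f0e", "#2ca02c"]  # one distinct color per topic

topic_colors = dict(zip(topics, palette))

# Every chart in a section then reuses its topic's color, e.g. in matplotlib:
#   ax.plot(months, values, color=topic_colors["Revenue"])
print(topic_colors)
```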

It’s always good practice to title the charts as specifically as possible to minimize confusion about the insights. Translating the blueprint you designed into an actual live dashboard will always result in a few unexpected hiccups, so it’s crucial that you review the dashboard as a whole, and each topical layer, for places where insights could get lost in the handoff back to a business professional. Through iteration, your dashboard will not only have all the right data, but also a form that resonates with the person taking action, leading to higher adoption and higher impact.


Christine Quan is a seasoned data and analytics veteran, focused on data visualization theory and building tools to empower data teams. She’s an expert at constructing SQL queries and building visualizations in R, Python, or JavaScript.


October 3, 2016 Health and Biotech Analytics News Roundup

The latest health and biotech analytics news and opinion:

insideBIGDATA Guide to Healthcare & Life Sciences: The news source has come out with a white paper evaluating different areas of healthcare that require big data technology.

A $280 billion healthcare problem ripe for technology innovation and predictive analytics: Behavioral health analytics could potentially have a very large impact on health care costs.

The New England Journal of Medicine Announces the SPRINT Data Analysis Challenge: The contest is looking for new ways to evaluate clinical trial datasets.

Advances in Next-Generation Sequencing: Long reads, single-cell sequencing, and cancer screening with DNA in the blood are exciting new areas in DNA sequencing.

The Sequencing App and the Quest for Fun: Joe Pickrell, a Columbia biology professor, has launched a company providing low-quality, low-cost genomes. He hopes it will get more people interested in biology.


How to Streamline Report Management to Enhance Your Data-Driven Business


A critical part of effectively exploring your data, transforming it into actionable insights, and enhancing decision-making for your business is being empowered to slice and dice your data and to be less dependent on technical resources for new updates. Improved visibility into insights will enable you to get more out of them, and as the business user, you’re best placed to identify which insights most closely meet your organization’s needs.

Analytics reports are a vital part of this process. But one of the perennial obstacles to getting the best insights is the way that reports are shared, which can affect how relevant and how valuable the data is. This, in turn, affects the quality of the insights that you can unlock.


So, it stands to reason that improving access to reports, and their relevance, can hugely enhance the value of your data. Plus, more effectively governing the distribution of reports and the levels of access to them, and to different parts of data sets, can simultaneously reinforce their relevance and safeguard the use of your data. These factors can be influential in boosting the efficiency of your operations and your business performance.

Achieving this involves streamlining report management, providing you with more autonomy and control over the reports that you have access to, how and when you receive them, and what content you get. Similarly, business analysts need better control of when, how, and to whom they distribute reports, and what they send to different groups or individual users. Let’s look at some of the challenges that this process poses, and how you can overcome them.

What’s holding you back from efficient reporting?

Typically, business analysts have had to shoulder the responsibility of setting up and managing reports for every end user. They decide what content each user receives, when and how often they get updates, and what level of access they have. All the users of a dashboard, or the recipients of a report, get it when the “owner” (the business analyst) decides it should be sent. Business users (end users) often aren’t able to request an individualized schedule or have reports tailored to them. Dashboards get shared among all users of that specific report, with no difference in content or timing.

Leaving the business analysts to manage this process is a lot of work and takes them away from their core functions. It also means that business users must rely on the analysts to make any changes to report settings, including unsubscribing where necessary, and they are dependent on the judgement of the analysts.

This is inefficient. First of all, there’s a bottleneck caused by end users waiting for report owners to create and distribute reports or make necessary changes. Secondly, these reports don’t precisely address the requirements of individual end users, because there’s too much management overhead to handle the individual subscription preferences of viewers at scale. Relevance suffers as users don’t necessarily get what they need, how they need it, and when they need it. They may get too much information, which they’re reluctant to wade through; too little information; or simply information that isn’t relevant, which renders any report useless. What follows is a decrease in user adoption, a reluctance to be data-driven, and ultimately a deterioration of decision-making within the business. All highly undesirable outcomes.

What developments in report management improve reporting?

Developments in report management deliver more control and precision to the way end users consume data and insights. They increase the efficiency of the way business analysts manage report distribution and scheduling. It’s all about governance.

Analysts no longer need to make all the decisions about reports and determine what goes to whom and when, because end users can now unlock subscription settings and choose when to receive specific reports and what to include in each one. Look for an analytics platform that allows you to stipulate when a report or a dashboard should be emailed to you, at a time and frequency that fits in with your schedule. By applying data exploration filters, you should be able to customize each report with filtered values of your choice, so you get only the most relevant data for your specific needs in your own view of any dashboard. You should also be able to personalize the look and feel of each report according to your specifications and preferences, just as a business analyst does. Reports can then be embedded in an email or attached as a PDF, all without assistance from a business analyst. And if, after all of this, a report still doesn’t meet your needs, you should be able to unsubscribe with one click.

On top of all of this, the tool you choose should allow analysts to schedule report distribution at different times for different audiences. They should be able to push a business-critical report at a specific time; for example, if management wants to send a weekly sales tracker to all their retail store managers every Monday at 9 am. They should also be able to tailor the content that they distribute to end users, unsubscribe individuals or specified groups, and importantly, in an environment where data security is a concern, they should be able to easily disable or govern dashboard subscriptions that may contain sensitive data, so that only approved audiences can receive it.
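
To make these capabilities more tangible, here is a minimal sketch of what a report subscription might look like as plain data. Every field name here is hypothetical, not any particular vendor’s API.

```python
# A hypothetical data model for report subscriptions: end users
# self-serve their own filtered, scheduled copies, and analysts can
# broadcast the same report to a whole audience at a set time.
from dataclasses import dataclass, field

@dataclass
class Subscription:
    recipient: str                 # a user or a group alias
    report: str
    schedule: str                  # e.g. a cron expression
    filters: dict = field(default_factory=dict)
    delivery: str = "email_pdf"    # or "email_embedded"

# An end user self-serves a filtered Monday-morning view.
mine = Subscription(
    recipient="store.manager@example.com",
    report="weekly_sales_tracker",
    schedule="0 9 * * MON",
    filters={"region": "Midwest"},
)

# An analyst pushes the same report to all retail store managers at 9 am.
broadcast = Subscription(
    recipient="retail-managers@example.com",
    report="weekly_sales_tracker",
    schedule="0 9 * * MON",
)

print(mine, broadcast, sep="\n")
```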

What are the benefits?

For business analysts and data teams, there’s less of a burden in managing reporting, because end users can now refine their own report subscriptions. Bottlenecks decrease because analysts no longer need to continually manage and change each individual report subscription and schedule. When they are involved, they can rapidly make changes across users at scale, more closely satisfying business users’ requests. Furthermore, they get more time back to focus on their most critical tasks for the business.

End users get more autonomy over analytics consumption, enabling them to leverage more contextually relevant insights that support their business processes. With more power in their hands, business users can personalize exactly how and when they consume insights, the data they get is more relevant, workflows accelerate, and your business ultimately achieves faster time to insight. Plus, efficiency improves, because resources are no longer wasted on inundating recipients with reports that they don’t find valuable or relevant.

A Win-Win for All

As businesses handle an increasing volume and complexity of data, it becomes more and more imperative that BI and analytics platforms can simplify the analytics process and customize dashboards. These developments in report management make data consumption and insight generation as seamless and relevant as possible to more users than ever.

With them in place, end users’ experience is better, the insights they get are far more precise and specific to their needs, and so user adoption increases. Report fatigue goes down, users’ engagement with critical KPIs goes up, and everyone gets more value from your BI and analytics.  As a result, you strengthen data-driven decision making throughout your organization, and you boost its significance as a dynamo of growth for your business.



How Kent State University is streamlining processes for recruiting and admitting students

Kent State University is a public research university located in Kent, Ohio, with an enrollment of nearly 41,000 students. 

Success in recruiting qualified students in sufficient numbers is the lifeblood of any university. In its efforts to aggregate data related to admissions, Kent State found itself dealing with a “spaghetti mess”, further complicated by its hybrid environment. Currently, the university relies on an on-premises Banner ERP system, but its Salesforce CRM and other SaaS applications live in the cloud.

Facilitating the transition to a cloud-based environment

To find the right solution to serve as a centralized integration hub, Kent State put out an RFP and evaluated software from several vendors. The university considered Talend because it provides data integration, ESB, data quality, and master data management all in one solution, and ultimately decided to deploy Talend Cloud on Amazon Web Services (AWS). Talend’s cloud-native character also helped decide Kent State in its favor.

“Data is the currency of higher education. It enables us to build relationships and understand how to engage students, faculty, staff, researchers, and alumni more effectively” – John Rathje, CIO

Talend played a huge role in supplying a wide range of data to multiple organizations in Salesforce. It enabled the school to integrate between 25 and 50 separate sources containing purchased lists of names of prospective students and to import the data into Salesforce for use in recruiting communications. Talend’s prebuilt connectors, especially the Salesforce connector, streamlined Kent State’s typical processes.
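
For illustration only, here is a minimal sketch of the underlying pattern: loading a purchased list of prospect names and pushing it into Salesforce. Kent State did this with Talend’s prebuilt Salesforce connector rather than hand-written code; this sketch uses the simple-salesforce Python library, and the credentials, file, and field names are hypothetical.

```python
# Illustrative sketch: import a purchased prospect list into Salesforce.
# Credentials, the input file, and column names are hypothetical.
import csv
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.edu",
                password="...",
                security_token="...")

with open("purchased_list.csv", newline="") as f:
    prospects = [{"FirstName": row["first_name"],
                  "LastName": row["last_name"],
                  "Email": row["email"]}
                 for row in csv.DictReader(f)]

# Bulk-insert the prospects as Lead records for recruiting outreach.
sf.bulk.Lead.insert(prospects)
```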

Managing the admissions process

A key Kent State system that relies heavily on Talend is CollegeNET, the school’s CRM system for managing the admissions process for graduate and international students. Talend is the critical component that integrates CollegeNET with Banner, the ERP widely used in higher education.

By catching faulty data early, Talend Cloud Data Stewardship has also eliminated the need for admissions staff to manually change data in Banner. That data cleansing process used to take up to 20 minutes per applicant and has now been significantly reduced. Currently, Kent State’s main ERP and data warehouse are on-premises, but plans are to move both source and target systems to the cloud. “Once we’re there,” says Holly Slocum, Director of Process Evaluation and Improvement for Kent State, “the flexibility the Talend cloud engines give us will enable us to avoid moving data in the cloud to an on-prem remote engine, then back up to the cloud. We also plan to look into installing Talend in an AWS or Microsoft Azure instance. If we want to take advantage of services from cloud providers, we’re not stuck with running the engine on-prem.” 
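
As a rough sketch of the idea of catching faulty data early (not Talend Cloud Data Stewardship itself), a few simple validation rules can route bad applicant records to a steward’s review queue before they ever reach Banner; the field names and rules below are made up.

```python
# Hypothetical early-validation pass: flag faulty applicant records
# for steward review instead of loading them into the ERP.
import re

def validate(applicant: dict) -> list:
    """Return a list of problems; an empty list means the record is clean."""
    problems = []
    if not applicant.get("student_id", "").isdigit():
        problems.append("student_id must be numeric")
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", applicant.get("email", "")):
        problems.append("email is malformed")
    if applicant.get("program") not in {"graduate", "international"}:
        problems.append("unknown program")
    return problems

applicants = [
    {"student_id": "811234567", "email": "flash@kent.edu", "program": "graduate"},
    {"student_id": "81A", "email": "not-an-email", "program": "undergrad"},
]

for applicant in applicants:
    issues = validate(applicant)
    status = "needs review" if issues else "clean"
    print(status, applicant["student_id"], issues)
```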




BigData At Work: Use Cases [Infographic]

Big data is a term every company endorses. They have massive amounts of data, yet they fail to use it in the most optimal way, one that generates consistent value. Big data holds big insights, so digging into big data provides a great opportunity. Consider a case where you could predict your churn, customer satisfaction, product demand, and business outcomes. How would you react? What if your models could bring certainty and predictability into the decision-making process?

Participants at the Smarter Analytics Leadership Summit were asked some questions about their use of big data and analytics. They are finding highly intelligent and profitable answers in clever analytics software and services that can process all the different kinds of data and make it more useful in key business decisions and processes, with impressive results. The following infographic explains some of the most common use cases.

[Infographic: BigData At Work: Use Cases]


Trust the Process

My son and I are really excited about the new NBA season. We are Atlanta Hawks fans, so we’re not too optimistic about this year. We know the team is young and has decided to undertake a rebuilding process. Our mantra for this season is the now familiar “trust the process”.

If you’re not aware, the phrase “Trust the Process” comes from the Philadelphia Sixers’ rebuilding efforts over the past couple of years. What’s most interesting to me is that the formula for team success is much broader now. It is no longer just about having great players, but about free agency positioning, analytical prowess, superior facilities, and developing long-term successful franchises.

It’s all about the process now.

I find the same can be said for delivering customer data and dashboard solutions.

Much of the historical focus when deploying data applications (customer dashboards, embedded analytics, etc.) has been on selecting the right tool. However, despite so many more great tools and increased investment in the BI space, successful implementation rates have not improved.

In a research piece by Dresner Advisory Services from May of this year, they highlight the fact that successful BI implementations are most often tied to having a Chief Data Officer (CDO). This makes a lot of sense because the CDO is just like an NBA team’s general manager. They bring accountability and experience as well as a process to make customer data solutions successful.

Here are some elements that make process so valuable to delivering data applications and solutions.

  • Launch Dates – A process is the best way to mitigate the risk of missing the launch date. Checklists, status updates, and documentation offer a means to anticipate the risks that cause delays. Remember that delays to the product launch or release directly impact revenue and reputation. Missing product launch dates is not something that goes unnoticed.

  • Customer Credibility – When delivery dates are missed, requirements miss the mark, or dashboard designs don’t serve their audiences, product confidence is lost. It’s not only the customer’s confidence that we need to be concerned about, but also that of the sales and marketing teams. Once we lose the trust of these audiences, it takes time to regain it, not unlike sports teams that fail to deliver winning rosters over many years (see: New York Knicks).

  • User Engagement – When there is no process, there’s no planned effort to understand the audience and deliver the dashboard design. If users can’t understand the data you’re sharing with them, a cancelled subscription is a near certainty.

  • Applications, not Dashboards – The best dashboards are purpose-driven applications. Tools don’t deliver purpose. The process undertaken to understand and solve a real problem delivers a purposeful solution.

  • A Complete Platform – A dashboard solution is only a means of displaying data. A process defines ALL the requirements. Having a process recognizes that a complete solution is needed, one that includes security, user administration, and application performance optimization.

Much like NBA success, a successful customer dashboard implementation isn’t about picking a product (player), but sustained success over many years of increased subscription (tickets) revenue, fan engagement and loyalty. The path forward for distributing and delivering on valuable data applications is all about your process.

In the event that you don’t have a process or a CDO leading your efforts, click here to learn about the Juicebox Methodologies. It’s our way to design and deliver successful, on-time applications as well as wildly loyal fans. Trust the process. It works.


Top 5 Ways AI Will Impact the Insurance Industry in 2019

I recently saw a tweet from Mat Velloso – “If it is written in Python, it’s probably machine learning. If it is written in PowerPoint, it’s probably AI.” This quote is probably the most accurate summarization of what has happened in AI over the past couple of years.

A few months back, The Economist shared a chart showing the number of CEOs who mentioned AI on their earnings calls.

Towards the end of 2017, even Vladimir Putin said that the nation that leads in AI “will be the ruler of the world.” Beyond all this hype, there is a lot of real technology being built.

So how is 2019 going to look for all of us in the insurance world?


The Significance of Self-Service Analytics: Advice from Real Application Teams

It’s a seemingly obvious but often-missed point: Different application end users will want to use embedded analytics in different ways. If you’re like most application teams, you’ll have everyone from basic users who want easy-to-use, interactive dashboards, to power users who demand sophisticated capabilities like workflow integration and operational reporting.

How can you meet the needs of every end user? When you embed self-service analytics in your application’s dashboards and reports, you empower all your end users to get the data and analytics they need—without requiring constant technical assistance.

>> Related: 7 Questions Every Application Team Should Ask Before Choosing an Analytics Vendor <<

The fact is, end users don’t want to send multiple ad-hoc reporting requests to your development team. They’re much happier when they can get the information they need on their own. Users want to do whatever they want with their data—which makes self-service analytics an important feature for any application.

The application team at Fieldology learned this firsthand when they were building a new application. They decided to engage their users by putting them in control to get the data they need, create new dashboards and reports, and explore information on their own:


“We believe that success is not about the collection of data, but how you use that data to make an impact on the bottom line. Creating portals and pre-built reports was a great start, but it soon became clear that we would need to develop more ad-hoc and self-service reporting capabilities. We needed to expand our solutions so they could be used by everyone from senior management, analysts, account teams and the field staff.”

 – Paul King, Fieldology


End users aren’t the only ones who benefit from self-service analytics. Embedding self-service means reducing the backlog of custom requests for your developers and IT team. In fact, according to the 2018 State of Embedded Analytics Report, 49 percent of companies saw a drop in the number of ad-hoc requests from end users after embedding self-service analytics. Since users no longer have to come to you with every ad-hoc analytics request, it frees your application team up to focus on more important matters.

The application leaders at Signeture Solutions and Hylant were both happy with the dual benefits of self-service analytics:


“Many of my customers are business decision makers, and they don’t want to have to wait a week for IT to give them a specific chart or calculation. IT will always have its place for installation and maintenance. But now, IT can simply provide all the information to business users and allow them to make decisions independently based on what they see in the data.”  

– Barry Nicolaou, Signeture Solutions


 “We’ve been able to push back report requests from users to the business unit, and say ‘Hey, we gave you this data, and you can now utilize it as you see fit.’ They’re not asking for just a report. They actually have the data available to them and they can make their own correlations and build on their knowledge level.”

– Scott Lindsey, Hylant


Self-service analytics is a great way to engage end users and make your application stickier. Nearly 70 percent of application teams saw an increase in the time spent in their applications after they embedded self-service, as shown in the 2018 State of Embedded Analytics Report.

According to the team at iDfour, their users are so excited about self-service analytics they can barely even get through a demonstration:


“When we show a new customer the self-service tool, they usually want to start asking questions of their data right away – before I can even finish explaining the features. Once they see what’s possible, they want to dive in immediately.”

– Deanna Antosh, iDfour


Ready to embed self-service analytics in your application? Find out how Logi Analytics can help you engage end users and reduce the backlog of ad-hoc reporting requests. Watch a demo today >
