Apparently, Uber has a company tool called “God View” that reveals the location of Uber vehicles and of customers who request a car. The tool reportedly allows a wide range of Uber employees to view customers’ locations. So this is “my data,” and now Uber has it! So what's the trend? Having differentiated customer data will be a source of competitive advantage from now on, and gaining consumers’ confidence will be key to getting it. Companies that transparently inform customers about the information they gather, give customers control of their personal data and offer fair value in return will be trusted. Consumers will even give such companies greater access to their data and a disproportionate share of their wallet. Soon, investors will value this, and it will be reflected in stock prices, too! Over time, customers will demand that their personal telecom, credit card and banking data be available in a personal data warehouse. Yet many industries still take a short-term view of this trend. Big banks have even tried to choke the flow of data to popular websites that help consumers manage their finances. In India, I am not sure how leading banks like HDFC and ICICI would respond to requests for electronic receipt of my data. Banks, facing increasing competition from these companies, are becoming more protective of their customer information and are limiting how much data they pass on. And yet, by allowing third-party software access to data, a bank could act as a platform for third-party innovation, just as Apple acts as a platform for developers through its App Store. In fact, this becomes even more critical in the healthcare arena. Shouldn’t patients be the owners of their own medical data? The U.S. government has an interesting Blue Button Initiative. This protocol already provides a secure way for veterans and Medicare beneficiaries to share their medical history with healthcare providers they trust.
You can use your health data to improve your health and to have more control over your personal health information and your family’s healthcare.

Power of APIs

Many businesses still fail to grasp that APIs can be a powerful force multiplier for them. Put simply, an API is a set of instructions that allows one piece of software to interact with another. In general, as the array of API-enabled devices and services grows, so does the range of ways they can be connected. An interesting UK government report suggested that: "You can even connect the lights in your living room to ESPN so that they flash when your football team scores, or to your calendar so that they blink on your birthday." The growth in the use of public APIs reflects the fact that there are a number of ways in which organizations can benefit from allowing their software and data to interact with third parties. For some companies, their APIs are their core business model. Twilio, for instance, provides a service that allows partners to send and receive voice and SMS communications. When a customer receives an SMS message telling them that their Uber driver has arrived, it is powered by the Twilio API.

An interesting proposition

Richard Thaler, a professor of economics and behavioral science at the Booth School of Business at the University of Chicago, has an interesting proposition. This is what Thaler provocatively suggests: “If a business collects data on consumers electronically, it should provide them with a version of that data that is easy to download and export to another website. Think of it this way: you have lent the company your data, and you’d like a copy for your own use.” I think this is a powerful idea: it can have a huge impact on consumers and also create a large number of intermediary companies that help consumers make sense of their data.
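Thaler's proposition is easy to make concrete. Below is a minimal Python sketch of what "lending the data back" might look like: the business exports a customer's usage history in a machine-readable format, and a third-party service consumes that export to recommend a better plan. All field names, plans and prices here are invented for illustration, not any real provider's format.

```python
import json

def export_usage(records):
    """The business's side of smart disclosure: hand the customer
    their own usage history in a portable, machine-readable format."""
    return json.dumps({"format": "smart-disclosure/v1", "records": records})

def recommend_plan(exported_data, plans):
    """A third-party service's side: read the exported data and pick the
    plan that would have cost the least over the user's real history."""
    records = json.loads(exported_data)["records"]

    def total_cost(plan):
        cost = 0
        for month in records:
            extra = max(0, month["minutes"] - plan["included_minutes"])
            cost += plan["monthly_fee"] + extra * plan["overage_per_minute"]
        return cost

    return min(plans, key=total_cost)

# Hypothetical usage history and competing plans.
usage = [{"month": "2017-01", "minutes": 450},
         {"month": "2017-02", "minutes": 500}]
plans = [
    {"name": "Lite",  "monthly_fee": 10, "included_minutes": 200,  "overage_per_minute": 0.10},
    {"name": "Jumbo", "monthly_fee": 30, "included_minutes": 1000, "overage_per_minute": 0.05},
]

best = recommend_plan(export_usage(usage), plans)
```

In this toy, the Jumbo plan wins because the customer's real usage blows past Lite's included minutes each month: exactly the kind of conclusion a consumer cannot easily reach without a copy of their own data.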
The idea of using analytics as “personal power” will certainly be resisted by companies and governments, but it has the power to let the customer control and manage her relationships with telecom, retail and myriad other companies. Here is how Thaler explains how this can happen: "'MyData' is the term we use in the United Kingdom; in the United States it’s called 'smart disclosure.' The idea is that in many cases we can help consumers simply by making their own usage data available to them. Here’s an example: You’re searching for a new smartphone calling plan. What you’d like is access to all the ways you use your smartphone – in a machine-readable format. That would create a business opportunity for online services that I call 'choice engines.' With one click you could upload all of your usage data, and the choice engine would recommend plans that suit your needs. We can help people make smarter decisions across many areas of their lives just by giving them access to their data." My experience of working in large banks and retail companies has been that thoughts like these are often dismissed as fluffy rather than hard, "revenue producing" ideas. This really is about becoming more customer-centric. In many companies, such as banks, business silos make customer-centric initiatives far more difficult. For marketers in service companies, the power to drive such thinking forward often does not lie with the CMO. Yet as industries are being disrupted, maybe this is the time to make CMO voices heard and take steps to become more customer-centric.

How does all this matter to companies?

1. Business is no longer the gatekeeper to data. Customers will demand their data, and intermediaries will build solutions that offer "customer value" on top of it. Banks, retailers and telecom companies will need to follow this trend and partner with their customers to make money.

2. Marketers can help customers lead a better life.
Whatever business you are in, customers will want you to add value to their lives. Helping customers use their own data in creative new ways can be a great differentiator. Customer data can be used to benchmark customers. Customers would love to know how their telecom spend compares with that of someone with a similar profile. Am I spending too much time on the phone lately compared to others? How many hours of kids’ television programming does my household watch compared to others? Customers may willingly provide more data (information about their family or interests) in return for additional value.

3. Creating a personal data product business. Most millennials are data natives. A data native is someone who expects their world to be not just digital, but smart, and able to personalize itself to their tastes and habits. For example, a bank should not only be digital and interactive – it should be personalized. It should tell you what you need to know based on your interests, location and preferences. Data products provide context and personalization. If your brand has the customer's trust and you can help them improve their lives, they may even be open to giving you their data from other service providers. The expectations have shifted. Companies need to focus on creating a product based on data and making it valuable for their customers. Read more on my blog about customers, analytics and how data can transform us. Article written by Ajay Kelkar Image credit by Getty Images, E+, Vertigo3d Want more? For Job Seekers | For Employers | For Influencers
"Chatbots are computer programs that mimic conversation with people using artificial intelligence. They can transform the way you interact with the internet from a series of self-initiated tasks to a quasi-conversation," according to Julia Carrie Wong of The Huffington Post. Bots can now recognize objects in images and video and transcribe speech to text better than humans can. A bot might be able to process your credit card faster than a human (though humans can still do things bots can't even comprehend). There’s Siri in our iPhones, Alexa in Amazon’s Echo and Facebook Messenger’s PSL (Pumpkin Spice Latte) bot. And David Marcus, vice president of Facebook Messenger, says we can’t escape the fact that bots are still in their early stages and require ongoing testing, measuring and improvement. Yet the tools and platforms for creating and hosting bots are becoming more widespread and so easy to use that you may be making your own bots within a year or two.

Versatility of bots and AI

"AI is all around us, from searching on Google to what news you see on social media to using Siri," said Babak Hodjat, co-founder and chief scientist of Sentient. "And with the momentum around AI growing every day, it’s not surprising that some of the most innovative retail sites have recently been experimenting with the use of AI, as well." This reminds me of the popular TV show “Undercover Boss,” where company owners disguise themselves as regular staff members and work alongside everybody else. Some of the discoveries they make are real eye-openers – leading to changes that make the company more efficient, profitable and enjoyable for everybody to work in. Likewise, "both chatbots and AI together form a small but significant step in revolutionizing the way enterprise solutions are supposed to be – simple, intuitive, and engaging," said Siddharth Shekhawat, CEO and co-founder of Engazify.
What makes bots noteworthy and special is that they can easily be integrated into existing communication platforms used by businesses to give an in-app experience to their users.

Bots are user-friendly: no programming required

Take San Francisco-based Motion AI, for instance. No programming skills are required. Regardless of how simple or complex your use case is, Motion AI streamlines the entire process. While there are thousands of chatbots everywhere, with a tool like this, if you can simply draw a flowchart, you can create a chatbot. This is how it works: diagram your conversation flow, connect your bot to a messaging service and go. Motion AI also allows you to deploy Node.js code directly from its interface, which makes it an excellent product for integrating your bot with third-party APIs, databases and services.

Bots versus chats

Kemal El Moujahid, lead product manager for the Messenger team at Facebook, puts it this way: "Typically, a bot would be useful to retrieve information for you, alert you at the right time or play a game with you. Very basic, very simple. When we opened the Messenger platform, it was natural to refer to these new experiences that developers could build for our users as 'bots.' Some would tell you the weather, others would send you news, all of this automated and through chat. If you prefer the feel of sending a text to filling out a field, you’ll prefer the chat bot experience." Given such scenarios, one can conclude that there is much more to bots and machine translation. According to Mariya Yao, head of design and engagement, "The practical applications are mind-blowing, as well. Computers can predict crop yield better than the USDA and diagnose cancer more accurately than elite physicians." Google replaced Google Translate’s architecture with neural networks, and now machine translation is closing in on human performance.
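The "draw a flowchart, get a chatbot" model described above reduces to a small state machine: each node carries a prompt, and edges fire on keywords found in the user's reply. Here is a vendor-neutral Python sketch of that idea; the flow, node names and keywords are invented for illustration and are not Motion AI's actual format.

```python
# Each node holds a bot prompt plus keyword-triggered edges to other nodes.
FLOW = {
    "start": {"prompt": "Hi! Ask me about our hours or our location.",
              "edges": {"hours": "hours", "location": "location"}},
    "hours": {"prompt": "We're open 9am to 5pm, Monday through Friday.",
              "edges": {}},
    "location": {"prompt": "We're at 123 Example St.",
                 "edges": {}},
}

def step(state, user_message):
    """Advance the conversation one turn: follow the first matching
    keyword edge, or repeat the current prompt if nothing matches."""
    for keyword, target in FLOW[state]["edges"].items():
        if keyword in user_message.lower():
            return target, FLOW[target]["prompt"]
    return state, FLOW[state]["prompt"]

state, reply = step("start", "What are your HOURS?")
```

Connecting such a flow to a messaging service is then just a matter of feeding each incoming message to `step` and sending back the returned prompt, which is essentially what the hosted bot platforms automate.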
But as developers build more and more valuable and delightful experiences for consumers and businesses, leveraging all the tools in a platform, it’s become clear that “bots” are about much more than chat.

Chatbots and AI landscape

As it stands, the chatbot ecosystem is already robust, encompassing many different third-party chatbots, native bots, distribution channels and enabling technology companies. (Laurie Beaver, Business Insider) There’s a global market for messaging and mobile interactions with different features. Facebook Messenger and WhatsApp combined see more than 60 billion messages sent each day across the two platforms. Traditional text messaging clocks in at approximately 20 billion sent globally. Another example is the Chinese messaging platform WeChat, whose P2P feature is accessed by 600 million of its users in China. There are, of course, many chatbot platforms that already exist. These tend to be bot-as-a-service platforms, where you can build, adapt and deploy your service in the cloud. WeChat, for instance, shows how adding payments to the chat experience can have vast implications for user engagement.

A look at the system

According to Dr. Peter Norvig and Professor Stuart Russell, in order to pass the Turing Test, a computer system would need to possess the following capabilities:

- natural language processing
- knowledge representation
- automated reasoning
- machine learning

"Natural language processing is possibly the hardest area of artificial intelligence to crack," said Paul Boutin, senior reporter of Chatbots Magazine. "A conversation held by a human customer service trainee who barely speaks the customer’s language is still beyond the reach of today’s NLP systems, or that rep would be out of a job. As it turns out, natural language processing is so complex that it isn’t just one field of research, it’s at least four." Chatbots often require a database backend.
Your options range from standard SQL databases, whose structure is often not compatible with natural language, to the more accessible NoSQL databases. You can also choose a graph backend. Graph database query languages can often easily mimic the connections present in natural language, which makes them ideal for building chatbots. (Alexandra Orth of Grakn.AI) Grakn.AI is a database for AI with a reasoning query language (Graql) that enables you to query for explicitly stored data and implicitly derived information. It uses machine reasoning to simplify data processing for AI applications with less human intervention. Other options to look at include the Microsoft Bot Framework, WIT.AI, Pandorabots and API.AI. All provide a different set of features, integrations and a different level of usability. For any AI system in which communicating with a human is a main part of its agenda, it is important to be able to use NLP as much as possible. So, NLP is not only useful but also necessary. That’s something we observe a lot nowadays with chatbots – for example, AI systems geared at emulating human communication through a web API in order to convey useful information or facilitate a certain action. (Zacharias Voulgaris, data science author) As the lesser-known components of AI, knowledge representation and automated reasoning aren’t as commonly spoken about in the press but nonetheless play a key role in the creation of intelligent systems. Knowledge representation allows us to make sense of the complexity in information. Automated reasoning capabilities allow a system to “fill in the blanks,” as there is no such thing as complete information or data with no gaps. (Precy Kwan of Grakn.AI) Another supportive element of the ecosystem is the user interface through which the end user engages with the bot – and this may take place across various platforms.
For example, Facebook Messenger, a website’s chat facility, inside an app on a device, or native (built into an operating system). (Glenn Miller, digital strategist)

Best bots for businesses

Adelyn Zhou, head of marketing at TOPBOTS, recently outlined the 100 best bots for brands and businesses leading their industries in messaging innovation. She said: "You’d never expect Mark Zuckerberg, king of social networks, to publicly admit that 'messaging is one of the few things that people do more than social networking,' but the evidence is now indisputable. In 2015, Business Insider reported that messaging apps overtook social networks in popularity. Since then, every major messaging product – from Messenger to iMessage, Slack to Skype, Echo to Allo – opened up their platforms for 'bots' and 'micro-apps' which enable users to interact with brands and business without leaving the app. Leading companies from every sector jumped at the opportunity to use conversation and voice to engage customers in a scalable, personalized way."

In closing

Chatbots are well-suited for mobile (possibly more so than apps), as messaging is at the heart of the mobile experience. Wide consumer adoption of AI and machine learning-based bots is growing rapidly, as well. Creating your own bot has never been easier or more affordable than it is now, with a variety of developer tools available from different platforms. It looks like chatbots are here to stay. Article written by Raj Kosaraju Image credit by Getty Images, E+, AndrewJohnson
Recently, a client came to us with a unique request. They wanted to programmatically read a PDF file, find a barcode contained in it, read the barcode to get the value it represented, and then rename the file to match that value. Scanning a physical barcode and getting the value is child’s play. But reading it out of a file? This was tricky business, and at first glance one we weren’t sure we could pull off for a reasonable time and cost. We’ve been working with the Qt technology framework since the year 2000, and it is our natural ‘go to’ for any project to see if it can solve our problem. At its heart, it is a multi-platform windowing toolkit for C++, but over the years many modules have been added that extend beyond just the presentation layer, with bindings to other languages. It is mature, with a robust community that keeps surprising us with the solutions they come up with. What we want to do in this article is show an example of developing a complex application with minimal effort by combining elements of both desktop application programming and web programming. Our application needed to work in batch mode. It was going to be set up in the Windows scheduler and process a directory of files whenever it ran. We built all the normal stuff into it to deal with missing directories, flexible paths, duplicate files and such, but that was simple; it was reading the PDF file and finding/interpreting the barcode that was our challenge. Initially, we looked at the ImageMagick conversion utility, but it flat-out didn’t do what we needed it to do. We looked around and found a lot of tools, but all of them were missing one thing or another, like batch mode, or the license was a problem. Since a ready-made solution wasn’t at hand, we decided to see what we could homebrew.
Since Qt is a mature technology, and it is trivial to create a PDF document with the QPrinter functionality, we wondered if there was a way to reverse the process and turn the PDF into an image file. Remembering PDF.js as another piece of mature software that does PDF rendering, we questioned whether there was a way to merge these technologies to get to our solution. After some tinkering, we found the Qt component QWebEngineView that solves our problem. Let’s take a look at how the code works. Based on the QMainWindow class:

m_webView = new QWebEngineView(this);
m_webView->load(url);
setCentralWidget(m_webView);

This will present a window that contains a webpage inside. Now let's have a look at PDF.js. The project is quite large and presents a heavy footprint, but there is an option to build a minified version that can easily be included in other websites. Here is how we retrieved and built it:

$ git clone git://
$ cd pdf.js
$ brew install npm
$ npm install -g gulp-cli
$ npm install
$ gulp minified

The result is a “compiled” version of pdf.js in the folder build/minified, which we then copy to our project folder. Now set the url to point to the local file minified/web/viewer.html:

auto url = QUrl::fromLocalFile(app_path+"/minified/web/viewer.html");

Now build and run. This worked perfectly right out of the box, so our concept is valid. However, it shows the PDF.js example PDF file, so how do we override that filename and inject our own into the JavaScript engine? There is another clever bit of Qt technology called QWebChannel. The idea is that on the C++/Qt side we instantiate a QWebChannel object and set this channel on a webpage.
With that channel, we can register objects that can be accessed from a JavaScript scope:

auto url = QUrl::fromLocalFile(app_path+"/minified/web/viewer.html");
m_communicator = new Communicator(this);
m_communicator->setUrl(pdf_path);
m_webView = new QWebEngineView(this);
QWebChannel * channel = new QWebChannel(this);
channel->registerObject(QStringLiteral("communicator"), m_communicator);
m_webView->page()->setWebChannel(channel);
m_webView->load(url);
setCentralWidget(m_webView);

The above code allows us to access the communicator object from JavaScript. Now we need to make some changes to viewer.html/viewer.js and add qwebchannel.js to allow communication on the other side, but that is simple enough:

1. For viewer.html, we just add a reference to qwebchannel.js:

<script src="qwebchannel.js"></script>

2. For viewer.js, we add the initialization of QWebChannel and override the filename just below the definition of the original url pointing to the example pdf file:

var DEFAULT_URL = 'compressed.tracemonkey-pldi-09.pdf';
new QWebChannel(qt.webChannelTransport, function(channel) {
  var comm = channel.objects.communicator;
  DEFAULT_URL = comm.url;
});

Here is the secret sauce in how this all works. Before page loading, we attach a web channel and register the communicator object. Then when viewer.html is loaded for the first time, we have a defined QWebChannel JS class. Just after the declaration of DEFAULT_URL, we create the JS QWebChannel. Once it is instantiated and communication is established, the attached callback is called, which reads the url from the communicator object. This url is used instead of the example pdf file. When the PDF.js code changes are done, just rebuild the minified version:

$ gulp minified

Now we copy the minified build to our project home.
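One piece the snippets above leave out is the Communicator class itself. The original write-up never shows it, so the following is a guess at a minimal implementation: a QObject that exposes the PDF path as a property, which is what lets the JavaScript side read channel.objects.communicator.url. The member and signal names here are assumptions inferred from the calls above, not the authors' actual code.

```cpp
#include <QObject>
#include <QString>

// Hypothetical sketch of the Communicator registered on the QWebChannel.
// The Q_PROPERTY declaration is what makes `communicator.url` visible
// to JavaScript once the channel is established.
class Communicator : public QObject {
    Q_OBJECT
    Q_PROPERTY(QString url MEMBER m_url NOTIFY urlChanged)
public:
    explicit Communicator(QObject *parent = nullptr) : QObject(parent) {}

    // Called from the C++ side before the page loads.
    void setUrl(const QString &url) {
        m_url = url;
        emit urlChanged(m_url);
    }

signals:
    void urlChanged(const QString &url);

private:
    QString m_url;
};
```

Any QObject with suitable properties or invokable methods would work here; the web channel takes care of marshalling the values across to the page.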
From here, we can make changes like allowing the application to accept command line arguments or provide a list of available PDF files to process – whatever it is you need. This formed the core of our project for the client, because we could now deal with the information inside the PDF much more easily. And there you have it: a complete PDF viewer desktop application in just a few hours (not counting research). Project github: Article written by Shawn Gordon and Oleksandr Iakovliev Image credit by Qt
Creativity died, and it’s time to give it a fitting obituary. The conference rooms of agencies and organizations throughout the marketing world have grown silent as our go-to approach to reach our intended target – creativity – is no longer the singular guidepost. Don Draper himself would be horrified to learn he would no longer have the ability to develop an imaginative new idea in a vacuum and force it upon millions who had little recourse other than to patiently wait for the next 30-second ad or flip the page. In the words of T.I. and Justin Timberlake, that era is officially “dead and gone.” Today, CMOs are not rewarding creativity as much as they are rewarding impact on their target’s behavior. This should come as little surprise given the stream of surveys and research from organizations ranging from Accenture to Gartner that describe revenue generation as falling primarily on the CMO. Late last year, the CMO Council and Deloitte declared, “Nearly 70 percent of CEOs expect CMOs to lead revenue growth with 33% saying revenue generation is the primary mandate of marketing.” Many have been whispering about the health and role of our old friend creativity since the late ’90s, when Taco Bell actually saw its sales decline while its iconic “Yo Quiero Taco Bell” dog became a cultural moment. No doubt some of today’s marketers would have Facebook Live and Snapchat Stories planned around Gidget (bet you didn’t know that was the name of the Taco Bell Chihuahua) in an attempt to “go viral.” And yet it would be for nothing, as today’s CMO would politely ask what that would do to generate sales while the others in the room stumbled over it being an awareness play.

Creativity has been replaced by “resonance”

"Resonance" is a word I believe you’ll be hearing a lot more moving forward. The Merriam-Webster definition of resonate states, “to have particular meaning or importance for someone.” It’s that individual notion of meaning that creativity often ignores.
To be fair, creativity historically never had the tremendous array of data available to really understand what would have meaning to its target. The smallest marketing department today can make use of a tremendous array of free tools such as Google Trends, What’s Trending on Twitter and InfoScout to understand what’s really on their target’s mind. Barriers, opportunities, cultural trends and emerging competitors are all there, updated by the minute in many cases. Each organic search or app download is a signal straight to a marketer’s ears – provided they are listening.

A worthy successor

Resonance is something I’m especially interested in because it elevates the goal of engagement to a more meaningful level beyond liking or sharing something. When a campaign or individual content unit resonates with someone, it causes them to take some type of action and affects their behavior in directly measurable ways. A broadcast segment about the latest tech gadget should have someone searching for more information from their smartphone. A personalized, relevant email should lead the recipient to click a link. A paid search ad that directly addresses the searcher’s interests should lead them to click. Even better, those signals can happen in near real-time with the nearly infinite array of tracking technologies available. Want to know if your latest media integration with a leading online influencer is resonating? Check your Google Analytics to see not only how many visitors have come in the last 24 hours, but also what percentage were new to the site, their engagement and how much they purchased. Don’t like the results? Change the message, the landing page or even the influencer with a few clicks of the mouse and keyboard. Packaged together, you have resonance – a worthy successor to our friend creativity as the aspirational goal for marketers everywhere. Article written by Jeff Bodzewski Image credit by Getty Images, DigitalVision, Dimitri Otis
Recently, I’ve been frustrated with some of the articles I’ve read about building great analytics dashboards. Their titles often led me to believe that they contained must-have insights about creating dashboards that engage users, but in the end, most are about the use of color, pretty charts and the latest feature a particular vendor has launched. In my experience as a data product builder, these things aren’t what will make your embedded analytics product a success. In the past, I followed dashboard-design rules similar to those I found in such articles, and yet we often didn’t generate enough user engagement to persuade the customer to continue paying for analytics. I built dashboards with cutting-edge animations that displayed data in the best possible format, and which looked modern and, frankly, beautiful. And I’ve seen many of them fail to engage users in the long term. As I’ve discovered since those early days, the rules of dashboards are different when applied to analytics-based products. "Pretty" simply isn’t enough. “If you build it, they will come…” A great line in a movie, but it often doesn’t hold true, at least for data products. For an analytics product team, it’s not uncommon to “build it” only to have users fail to walk out of that cornfield, or to have the users that do arrive head back into the corn after just a few visits. When you’re the person responsible for bringing a product successfully to market, that’s a big problem. Luckily, through the judicious application of mistakes and failed attempts on my own products, I’ve learned a few lessons that help increase data product adoption both in the short term and over the long haul.

Through trial and error, here’s what I found

I, like many product owners, was targeting the wrong users with the wrong functionality. I’d often go after the “core user” – the person sitting in front of whatever application my analytics would be embedded into – thinking this would have the most impact.
When adding analytics to a CRM system, I would think about analytics for the salesperson. Order management system? I might focus on analytics for the order entry clerk. While it seemed logical, it failed more often than it succeeded. As I learned the hard way, when you launch your product you need user engagement, you need it fast and you need the right kind of user. Enter the CEO.

Why the CEO persona?

Now, when I start designing the initial analytics dashboards for a product, I build the CEO dashboards first. Of course, I get to other personas later, but I start with the CEO. Why? Because in addition to being the person who ultimately controls the project budget, the CEO is the person who sets the decision-making “style” for the entire company. Get the CEO hooked on using analytics to drive decisions, and the only way she’ll give up such a powerful tool is if you pry it away. Solve her problems, and she’ll be your champion and drive adoption throughout the business. Sit in on an executive staff meeting where the CEO knows more than her subordinates about an issue because of dashboards, and watch how fast adoption spreads downward.

Get inside the mind of the CEO

If the success of your data product depends on generating engagement, and the CEO is the most effective place to begin creating that engagement, how do you get started? The usual methodology for picking the best analytics for your user personas is to pick a persona, identify their “mission” or work goals, map the workflow they follow and note any gaps that exist. The analytics you build should directly correspond to the gaps you’ve found in the workflows for each persona. This is a great process for ensuring that every analytic placed on a dashboard is tied to the needs of a persona, but it doesn’t work well for the CEO. A CEO has such broad responsibilities, such varied “missions,” that creating workflows and finding gaps/pain points is a difficult proposition.
So I use another technique: empathy. You begin the process by understanding the traits common to many top executives and then building dashboards that work with, rather than against, those characteristics. Here are a couple to consider:

Characteristic #1: Extreme time pressure

Most CEOs are busy people. Between customer visits, tackling the problem of the day, interviews and board meetings, it’s amazing that they have any time at all to review operational performance. Like me, you’ve probably had countless meetings scheduled with top executives that get pushed out days or weeks due to various crises or obligations. Time is at a premium for the CEO, and the limited time they do have is vitally important. You can use this to your benefit.

Characteristic #2: Anxiety

Not the crippling, "can't get out of bed" kind of anxiety, but rather the “I have a lot on my plate and none of it is trivial” type of stress. A high level of anxiety is common for CEOs, and with good reason. If they make mistakes, they can put the entire business at risk. People can get laid off, stakeholders can lose money, businesses can fail. A lot is on the line. Additionally, in a larger business the CEO can’t possibly understand all of the key issues at play across all of the many organizations. Due to the scope of the role, she’s operating with incomplete information. You’d be a little anxious, too. But just as a little anxiety can be a good thing when so much is at stake, it can also be a valuable tool for the analytics product leader when designing dashboards. This is the world of the CEO, whether she is building customer-facing SaaS applications, running a manufacturing business or leading a non-profit. With an understanding of a couple of the pressures they face, you can start to design dashboards that will reduce their anxiety and time concerns while producing high engagement.
Three elements of a great analytics dashboard with the CEO in mind A word of warning: What follows below might not be what you’re expecting. It’s distinctly lacking in information about aesthetics, color theory or optimal chart selection. Though these elements are important, none are key to generating CEO engagement (and hence, revenue). Here’s what is key: 1. Design for triage Imagine a scenario in which you’ve come across the scene of an accident and people have been hurt. Injured victims lie on the ground in front of you and they need help now. Where do you start? For emergency responders, there’s a protocol in place known as the “ABCs.” You examine each of the injured for Airway, Breathing and Circulation. Depending on the results, the responder may treat the patient immediately or move on to a more critical situation. This is done in part to relieve some of the extreme time pressure on the emergency responder – take too long, or treat the less critical patients first, and people can die. The CEO, also under time pressure, has a similar need to identify the most critical “patients,” take action and move on as soon as possible. You can generate CEO engagement by designing your dashboards for easy executive triage. Triage-based dashboards should do the following: Show an overview of the situation Identify the most critical “patients” Indicate how badly each is “injured” With the limited time available, the CEO user must be able to open the dashboard, receive a quick synopsis of key operations via key performance indicators and understand which areas need immediate attention to ensure the health of the business. Unfortunately, many dashboards aren’t designed with CEO engagement in mind and require clicking, scrolling, drilling and interpretation to see the big picture. For the rare occasion when the CEO has the time to spend exploring the analytics, this is fine. 
For the everyday situation where the CEO needs to gain an understanding of performance and move on to the next task, it makes the dashboard frustrating. It’s easier to ask a subordinate to perform some analysis and report back. The time and effort required to triage business health with your dashboard is too high. There goes your engagement… If you want to create an engaging dashboard for the CEO, make sure that it’s designed for easy problem triage with the most critical issues front and center. 2. Design for relationships The CEO role is fundamentally different from the jobs most of us hold. We might have responsibility for a team, a product or a quota, but in most cases we have limited scope. We can focus our attention on that single team, region or department. The CEO job is different. Imagine if you – the manager of an order processing team – had the following schedule: 8 am: Review yesterday’s performance 9 am: Meet with irate customers who have delayed orders 9:30 am: Conference call with analysts and press to discuss the future of order processing in the 21st century 11 am: Review space planning with the building facilities team to ensure room for future growth 12 pm: Lunch with key investors to reassure them that their investment is safe You get the idea. If you had a schedule like this, you’d have a tough time catching your breath, much less achieving an understanding of the strategic needs of the order processing team. From hour to hour, the CEO can be working on highly disparate projects, all of which are important, but for which the common thread tying the activities together might be hard to see. Engaging dashboards will help the CEO see these patterns. Your CEO analytics dashboards must show not just “snapshots” of performance but a coordinated picture of the interrelationships between key elements of the business. 
In past years, this might have been displayed as a “balanced scorecard” of metrics from finance, operations, customer satisfaction and employee satisfaction. In recent years, with all of the amazing analytical tools at our disposal, we’ve gotten away from the “balanced” approach and now favor dashboards based on whatever data is readily available and can be displayed in an attention-grabbing format. These look nice, but don’t help the CEO look across the business, and therefore don’t offer an incentive to keep using the dashboards. A big-picture dashboard showing cross-organizational relationships does. Here’s what you need to do: Map out (at a high level) the key processes of the business, such as “develop new products,” “sell products,” “install products,” “support customers,” etc. Break each key process into 3-5 phases and identify the metrics for each of those phases. These metrics summarize health within the process, but aren’t so detailed that they bog down the CEO. For each key process, create a high-level set of 2-3 summary metrics that captures the overall process health. Combine the summary metrics and the phase metrics into a single “process health” block for your dashboard. Arrange the executive dashboard by these “process health” blocks in order, so that the CEO can assess health from development, to sales, to delivery and so on. Combined in such a manner, the dashboard lets the CEO view the performance of the business in a logical way. This enables the understanding of relationships: “it appears that the selling process is going well, but look – we’ve got backlogs over there in the installation area. We’re going to start getting support calls soon. Better focus on that!” Showing individual metrics forces the CEO to mentally build the interconnectedness themselves. Showing the relationships allows them to manage the business. And that generates engagement. 3. 
Design for action Triage and an understanding of relationships are only valuable if you can do something about the problems you uncover. Years ago, my team built a set of dashboards for executives to show the performance of projects around the company. It worked well – over time we were able to shift operational reviews from PowerPoint-based meetings to web-based review calls. But I missed the “take action” part of the equation. There was no tracking of identified issues or tasks designed to bring resolution within our system. Each review of the dashboards would result in an executive assigning actions – verbally – to a manager who would dutifully note the task and then work the solution. Further follow-up was then completely offline, taking the form of emailed status reports or hallway conversations. Not a great way to generate repeat engagement for your data product… Big miss on my part. When building a data product that will be engaging for the CEO, you need to allow them to take action right then and there. This might be in the form of an “Assign Task” button right next to a chart on the dashboard, or it might be a simple message thread below the chart. What it shouldn’t be is an external email link or any other mechanism that forces the CEO out of the system to check the future status of the problem. Design your dashboard so the CEO sees the issue, requests an action be taken from within the analytics system, and then returns to the application to learn what progress has been made. Having the dashboard serve as the sole source for checking status is a technique I call “loading the itch.” It stimulates the CEO to return to your data product to learn what – if anything – has been done to solve the problem that she assigned. Like the drive you feel to keep checking your social media account to see responses and “likes,” the drive for the CEO to return to the analytics will be powerful, and the result will be consistent executive engagement. 
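The “process health” block and triage ideas above can be sketched in a few lines of code. This is a minimal illustration only, not any BI product’s actual API; every process name, metric and threshold below is invented for the example.

```python
# Sketch of "process health" blocks sorted for executive triage.
# All process names, metrics and targets are hypothetical examples.
from dataclasses import dataclass, field

@dataclass
class Metric:
    name: str
    value: float
    target: float

    @property
    def health(self) -> float:
        """Ratio of actual to target; below 1.0 means underperforming."""
        return self.value / self.target

@dataclass
class ProcessBlock:
    name: str                                      # e.g. "sell products"
    summary: list = field(default_factory=list)    # 2-3 summary metrics
    phases: dict = field(default_factory=dict)     # phase -> list of metrics

    @property
    def health(self) -> float:
        """Overall health is the weakest summary metric (the triage view)."""
        return min(m.health for m in self.summary)

def triage(blocks):
    """Order blocks worst-first so the most 'injured' process sits on top."""
    return sorted(blocks, key=lambda b: b.health)

# Example: selling looks fine, but installation is badly backlogged.
sell = ProcessBlock("sell products",
                    summary=[Metric("bookings ($M)", 11.0, 10.0)])
install = ProcessBlock("install products",
                       summary=[Metric("backlog cleared (%)", 60.0, 95.0)])

worst_first = triage([sell, install])
print([b.name for b in worst_first])  # → ['install products', 'sell products']
```

The point of the sketch is the ordering: the dashboard surfaces the most critical “patient” first, so the CEO can assess the business and act without scrolling or drilling.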
Wrap-up Don’t get me wrong, I really enjoy reading all of those articles about building great dashboards. I love learning about color, gestalt theory and interaction design. But these things aren’t enough for the product team seeking to create a data product that generates revenue. You need to design analytics that get key decision makers – like the CEO – seeking out your dashboards on a daily basis. Understand the CEO, her key motivators and how you can help her better understand the performance of her business, and you’ve got an engaged user, and likely, a new advocate. Article written by Kevin Smith Image credit by Getty Images, Moment, Guido Mieth Want more? For Job Seekers | For Employers | For Influencers
I have been a marketing person multiple times in my career. However, to be honest, I have struggled to sell my marketing brethren on the value of investing in customer data, including protecting the sensitive data contained within it. And this has not been for a lack of great reasons from my CIO friends. Stephen Landry, CIO of Seton Hall University: “Information security is an essential part of customer experience, as Target now knows.” David Chou, CIO of Children's Mercy Hospital: “Once the trust is gone, forget about all of the digital and innovative experiences you have put forth.” Isaac Sacolick, former CIO of Greenwich Associates: “You can compete on overall customer experience, innovation, performance, service, design – and a big yes to security and trust.” Cynthia Stoddard, CIO of Adobe: “Data is the new currency for organizations driving insights, customer engagement, and ultimately, financials.” My question has been, how do I relay these great arguments in the language of marketing? The answer became obvious recently as I was teaching a marketing class. During a discussion about David A. Aaker’s "Managing Brand Equity", one of my students asked about United Airlines and whether its PR disaster could reduce the value of a firm’s brand. Of course, the answer is yes! So, I asked the class to consider: what about a hack that led to the loss of customer data? I heard another yes here. What is brand equity? “Brand equity is a set of brand assets and liabilities linked to a brand, its name and symbol that add or subtract from the value provided by a product or service to a firm and/or to a firm’s customers.” ("Managing Brand Equity", David Aaker, page 19) The elements that drive brand equity on the asset or liability side of the equation include: Brand loyalty – are existing customers satisfied with the brand? Brand awareness – are customers familiar with the brand in a positive way? Quality perception – how do customers perceive the overall quality of the brand? 
Brand association – what values do customers associate with the brand? How would a customer information loss impact brand equity? We would all consider a major loss of customer data a major business disaster. Aaker covers business disasters under the topic of brand associations, saying that a disaster affects image, and thus, brand equity. Aaker says that the best approach is for brand managers to ensure that a disaster is avoided; where one does occur, detect it early and do something about it before it blows up. Aaker goes even further, saying that the remedy should be as convincing as possible. This is clearly the tactic now taken by Target. When asked a few weeks ago about digital trust, the elephant in the room for digital disruption, Target’s CIO said, “We have to prove to our customers every day that their data is secure; it can never leave us again.” Another major loss of customer data would have disastrous consequences for the Target brand. With 10 percent of revenues spent on marketing, how much of a hit can brands take before this investment becomes a wash? I would suggest that brand managers can no longer allow their organizations to plead ignorance of the possibility that customers will walk or stay away after a failure to protect customer data. Financial penalties for failure to comply can erode the value achieved by brand investment. Should a portion of the marketing budget be put into customer data protection programs? Personally, I think the answer should be yes here, too. Parting thoughts It is time for marketing to provide better cover for CIOs trying to invest in protecting customer data. A sensitive data loss means losing brand equity; this is a justification that can sway even a CFO in cost-cutting mode. Further reading Former CIO and current CTO Tom Fountain: On the business value of data protection How should you protect enterprise data? CFOs: Are you responsible for protecting your firm’s corporate data? 
Twitter: @MylesSuer Article written by Myles Suer Image credit by Getty Images, Corbis, Larry Williams
Epic Sciences last week completed a $40 million Series D financing led by Hermed Capital. Epic is developing a portfolio of blood-based tests that predict drug response in cancer and recently partnered with Genomic Health to commercialize the OncotypeDx® AR-V7 Nucleus Detect™ test, leveraging Genomic Health's commercial channel and enterprise systems. Altos Capital Partners, Domain Associates, Genomic Health, Pagoda Investment, Reach Tone Limited, RMI Partners, Sabby Capital and VI Ventures also participated in the financing. Headquartered in San Diego, the company was founded on a powerful platform for identifying and characterizing rare cells, including circulating tumor cells (CTCs). "We believe Epic Sciences' OncotypeDx AR-V7 Nucleus Detect test and its robust pipeline of predictive tests in oncology are truly unique in the industry," said Jerry Xiao, Ph.D., managing director of Hermed Capital. "These tests, enabled by Epic's sensitive, scalable and distributable rare cell detection technology, have the potential to revolutionize cancer care globally. We look forward to assisting Epic in exploring the Chinese market and commercializing its CTC platform and No Cell Left Behind® technology." Epic plans to use the funds to accelerate clinical studies for oncology tests in the company's pipeline, as well as to enhance its proprietary No Cell Left Behind technology to include characterization of rare leukocyte cell populations with digital imaging and big data analytics. These enhancements are designed to drive insights into the cellular drivers of response or resistance to key drug classes such as immuno-oncology drugs. "We are extremely pleased to have such a strong and diverse international syndicate that encompasses thought leaders from the venture capital and pharma communities, diagnostic strategics, innovators from the technology space and best-in-class partners in the Chinese market," said Murali Prahalad, Ph.D., president and CEO of Epic Sciences. 
"This funding will help advance our mission to provide physicians and patients with the most predictive, precise and personalized information to guide cancer therapy." Working with its partners, Epic seeks to significantly improve clinical outcomes for patients through the development of novel predictive tests that help physicians provide therapies individualized to a patient's unique tumor biology. Epic has published clinical data in the journals JAMA Oncology and European Urology on the OncotypeDx AR-V7 Nucleus Detect test in metastatic castration-resistant prostate cancer. These data show that patients with detectable levels of AR-V7-positive circulating tumor cells in their blood had significantly better clinical outcomes when treated with taxane chemotherapy than with more expensive targeted therapies. Once this test is commercialized, it will help physicians and patients decide whether to proceed with taxane chemotherapy or androgen-directed therapeutics. It is estimated that each year, about 50,000 mCRPC patients could benefit from knowing their AR-V7 status prior to selecting further treatment. "Our strategic collaboration with Epic Sciences allows us to leverage Genomic Health's successful commercial channel to help address a significant unmet need in the management of late-stage prostate cancer," said Frederic Pla, Ph.D., chief business and product development officer, Genomic Health. "Epic Sciences' industry-leading CTC-based platform combined with our Oncotype IQ™ Genomic Intelligence Platform will help further our offerings to oncologists and urologists." Article published by Anna Hill Image credit by Getty Images, Cultura, Andrew Brookes
Just a year after its launch, Toyota Connected is planning to double its headcount and expand its office space at the Legacy West Urban Village, adjacent to Toyota Motor North America's new headquarters. Toyota Connected is the data science hub for Toyota's global operations and supports a broad range of consumer-, business- and government-facing initiatives. It was created to significantly expand Toyota's capabilities in the fields of vehicle data science, machine learning and contextual data services development throughout Toyota's global operations. Founded in April 2016, the startup initially planned to employ 100 and has already hired about 55 technologists, including data scientists, engineers and software developers. It now plans to employ 200. Toyota Connected is also adding 13,500 square feet to its existing 20,000-square-foot lease in the Urban Village office block in Plano. "The amount of interest and demand in leveraging our big data to create new products and services has been incredible," said Zack Hicks, Chief Executive Officer of Toyota Connected and Chief Information Officer at Toyota Motor North America. "Over the past year we've been scaling up to deliver on these new opportunities, which will significantly impact the way consumers interact with vehicles." Toyota Connected has begun to develop Toyota's Mobility Services Platform, a global, cloud-based digital ecosystem that will enable a host of mobility services, including ride sharing, car sharing and remote delivery. Working closely with affiliate Toyota and Lexus Financial Services, the company is also conducting trials of flexible leasing and innovative payment features that will enable new and innovative uses for vehicles as the burgeoning sharing economy takes shape. "Our mobility platform offers more functionality within the vehicle itself, but it goes way beyond that," Hicks said. 
"We're exploring services to manage fleets, expand ride-sharing capabilities and leverage big data to contextually and intuitively improve services for our customers to deliver an amazing experience with our Toyota and Lexus brands." Toyota Connected is delivering these wireless services using Microsoft's cloud-based Azure platform. A division of Toyota Motor Corp., Toyota Connected provides a range of data and computer science services across operations, including support for ongoing research into artificial intelligence and robotics, and for the Toyota Research Institute. Article published by Anna Hill Image credit by Toyota Connected
Ever since I first heard of Hadoop and researched it, it seemed like a poor solution. Way too much work, detached data, not real time, reliant on IT to put their queries together, etc. Sisense saw this issue as well, but they addressed it in a totally different, two-pronged approach. This isn’t a product review, but rather an overview of the technology and possibilities. Sisense is a provider of Business Intelligence (BI) technology that includes a back-end powered by “in-chip” technology that easily enables non-techies to access and analyze large data sets from multiple sources, and a front-end for creating dashboards and reports that will display on any device, including mobile. I’m going to focus on the former. BI applications typically process an extraordinarily large amount of data to provide useful feedback in an easily digestible fashion. Therefore, they tend to be fairly slow, as there is a constant stream of data from disk to memory to the CPU. Many vendors process as much in RAM as possible to speed things up, but this requires lots and lots of RAM. To make more efficient use of the RAM, pretty much all the BI vendors use columnar databases as the staging area; this is about as efficient as you can get with your result set, as you drop the data you don’t need. This is where Sisense diverges from the crowd, and how they claim they can process 100 times the data at 10 times the speed of the competition. Sisense leaves that data on the disk instead of trying to jam it into RAM, then compresses the heck out of it. Now here is where their secret sriracha sauce comes in: They do the decompression in the CPU cache. As you can see from the graphic above, even the L3 cache of the CPU is orders of magnitude faster than RAM, and because the data is all highly compressed, it is moving off disk and through RAM at a significantly faster pace than could normally be achieved. The application doing all this work is Prism, and it doesn’t stop there. 
Prism holds a memory map of the current location of all data. When it processes data or does any type of calculation, it applies vector algebra to the data, thus enabling Prism to take advantage of the x86 in-chip Single Instruction Multiple Data (SIMD) vector instructions. This allows short arrays of data to be processed by a single instruction. As a result, the CPU cores are able to process data much faster and in parallel. Prism is designed to keep the CPU as active as possible. It is the fastest piece of hardware in your box and is often underutilized, but Prism changes all that. The learning algorithm built into Prism means that over time it gets even faster, as it more intelligently optimizes and pre-fetches data. This also means that, counterintuitively, it can get faster with more people using it, because pre-loading optimizations improve as more queries are performed. Back to my opening statement about Hadoop. Sisense and Prism will work with Hadoop and MapReduce data. However, the whole distributed and difficult nature of Hadoop evolved around having to manage insanely large amounts of data with no realistic way to do it in real time. With Sisense, you are able to crunch through terabytes of data with a large number of concurrent users, on a single commodity server, without a team of IT guys constantly managing it. CPUs just get more and more powerful, with more cores and, presumably, more cache. The technology that Sisense has created is about as durable as it gets in terms of future-proofing. Nearly three quintillion bytes of data are created every day; 80 percent of it is unstructured, and only 20 percent of it is available to be processed. The need for tools to effectively dig through all that data and present useful results is clear, and there are many vendors providing them, but Sisense, in my opinion, is genuinely addressing it the way it needs to be addressed. 
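The in-chip idea above, keeping data compact and contiguous so vector instructions can chew through whole arrays at once, can be illustrated loosely with NumPy, whose array operations run as tight, SIMD-friendly loops over contiguous columnar memory. This is only a conceptual sketch of why columnar plus vectorized beats row-at-a-time processing; it is in no way Sisense's actual implementation.

```python
# Conceptual sketch: row-at-a-time vs. vectorized columnar processing.
# Not Sisense/Prism code; just the general principle the article describes.
import time
import numpy as np

# One "column" of a million sales amounts, stored contiguously in memory.
amounts = np.random.default_rng(0).uniform(1, 500, size=1_000_000)

def total_over_threshold_rows(values, threshold):
    """Row-at-a-time: one interpreted loop iteration per value."""
    total = 0.0
    for v in values:
        if v > threshold:
            total += v
    return total

def total_over_threshold_vector(values, threshold):
    """Columnar/vectorized: one operation over the whole contiguous array,
    which the CPU can execute with SIMD instructions on cached data."""
    return float(values[values > threshold].sum())

t0 = time.perf_counter()
slow = total_over_threshold_rows(amounts, 400.0)
t1 = time.perf_counter()
fast = total_over_threshold_vector(amounts, 400.0)
t2 = time.perf_counter()

# Same answer (modulo float rounding), very different cost.
assert abs(slow - fast) <= 1e-6 * fast
print(f"row loop: {t1 - t0:.3f}s  vectorized: {t2 - t1:.3f}s")
```

On typical hardware the vectorized version is one to two orders of magnitude faster, which is the same lever, in miniature, that in-chip engines pull by keeping compressed columnar data close to the CPU.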
Article written by Shawn Gordon Image credit by Sisense
SAN FRANCISCO (Reuters) – Microsoft Corp is rolling out upgrades to its sales software that integrate data from LinkedIn, an initiative that Microsoft CEO Satya Nadella said was central to the company's long-term strategy for building specialized business software. The improvements to Dynamics 365, as Microsoft's sales software is called, are a challenge to market leader Salesforce and represent the first major product initiative to spring from Microsoft's $26 billion acquisition of LinkedIn, the business-focused social network. The new features will comb through a salesperson's email, calendar and LinkedIn relationships to help gauge how warm their relationship is with a potential customer. The system will recommend ways to save an at-risk deal, like calling in a co-worker who is connected to the potential customer on LinkedIn. The enhancements, which will be available this summer, will require Microsoft Dynamics customers to also be LinkedIn customers. The artificial intelligence capabilities of the software would be central, Nadella said. "I want to be able to democratize AI so that any customer using these products is able to, in fact, take their own data and load it into AI for themselves," he said. While Microsoft is a behemoth in the market for operating systems and productivity software like Office, it is a small player in sales software. The company ranks fourth – far behind Salesforce and other rivals Oracle and SAP – with just 4.3 percent of the market in 2015, the most recent year for which figures are available, according to research firm Gartner. Salesforce declined to comment on Microsoft's competing software. But Nadella said specialized applications in fields like sales and finance are critical to the company's future. He bills them as Microsoft's "third cloud," the first two being Office 365 for general productivity like email and Azure for computing and databases. 
Nadella's bigger vision is to have all products take advantage of a common set of business data that can be mined for new insights with artificial intelligence. "I think that's the only way to long-term change this game, because if all we did was replace somebody else's (sales), or (finance) application, that's of no value, quite frankly," he said. Microsoft pointed to Visa as a success story. Earlier this year, Visa was in the process of choosing a cloud-based customer service software system and picked Microsoft's over Salesforce's. Rajat Taneja, executive vice president of technology at Visa, said Nadella's three-cloud strategy was the deciding factor. But Microsoft has a long way to go. The company has never released a revenue figure for Dynamics, though the former head of Dynamics said publicly in 2015 that it was a $2 billion business unit. That compares with Salesforce revenue of $8.3 billion overall and $3 billion for its sales software specifically. Dynamics also grew more slowly than Salesforce last year – Dynamics revenue grew just 4 percent versus 26 percent for Salesforce, which is also rolling out artificial intelligence features. Nadella is under pressure to show that the pricey LinkedIn acquisition in mid-2016 was worthwhile. R "Ray" Wang, founder of analyst firm Constellation Research, said LinkedIn-powered features, combined with popular programs like Office and Skype, could help. "Microsoft is putting together the contextual business data people need to be more efficient and build better relationships," Wang said. Nadella said Microsoft will also continue offering certain LinkedIn data to other companies, including Salesforce, as LinkedIn did before its acquisition. Salesforce had urged European regulators to probe the Microsoft-LinkedIn deal, which they ultimately declined to do. "That ecosystem approach is something that we will absolutely maintain and, in fact, if anything keep continuing to evangelize," Nadella said. 
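As a thought experiment only, since Microsoft has not published its scoring logic, a "relationship warmth" signal of the kind described could be approximated from interaction data along these lines. Every signal, weight and threshold below is invented for illustration.

```python
# Toy "relationship warmth" heuristic. Purely hypothetical: these weights
# and thresholds are invented and do NOT reflect Dynamics 365's algorithm.

def warmth_score(emails_90d, meetings_90d, days_since_last_touch,
                 shared_linkedin_connections):
    """Combine recent interaction counts into a single warmth number."""
    score = (2.0 * emails_90d
             + 5.0 * meetings_90d
             + 1.5 * shared_linkedin_connections)
    # Decay factor: a relationship cools as the last touchpoint recedes.
    score *= max(0.0, 1.0 - days_since_last_touch / 90.0)
    return score

def recommend(score, shared_linkedin_connections):
    """Mimic the 'save an at-risk deal' suggestion from the article."""
    if score >= 25:
        return "relationship is warm - proceed"
    if shared_linkedin_connections > 0:
        return "at risk - ask a connected colleague for an introduction"
    return "at risk - re-engage directly"

# A deal gone quiet: few emails, no meetings, 60 days of silence,
# but four shared LinkedIn connections to draw on.
cold_deal = warmth_score(emails_90d=3, meetings_90d=0,
                         days_since_last_touch=60,
                         shared_linkedin_connections=4)
print(recommend(cold_deal, shared_linkedin_connections=4))
```

The interesting design point, visible even in this toy, is the second branch: the graph of shared connections is what turns a cold score into an actionable recommendation, which is precisely the asset the LinkedIn acquisition brought.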
Reporting by Stephen Nellis; Editing by Jonathan Weber and Mary Milliken Image credit by Getty Images, NurPhoto