Transcript of a discussion on how the latest research and products bring the power of people and machine intelligence closer together to make analytics consumable across more business processes.
Dana Gardner: Hi, this is Dana Gardner, Principal Analyst at Interarbor Solutions, and you're listening to BriefingsDirect. Our next business intelligence (BI) trends discussion explores the latest research and products that bring the power of people and machine intelligence closer together.
As more data becomes available to support augmented intelligence -- and the power of analytics platforms increasingly goes to where the data is -- the next stage of value is in how people can interact with the results.
Stay with us now as we examine the latest strategies for not only visualizing data-driven insights but making them conversational and even presented through a form of storytelling.
To learn more about making the consumption and refinement of analytics delivery an interactive endeavor open to more types of users, we are now joined by Elif Tutuk, Head of Research at Qlik. Welcome to BriefingsDirect.
Elif Tutuk: Thank you. It's a great pleasure to be here.
Gardner: Strides have been made in recent years in better accessing data and making it available to analytics platforms, but the democratization of the results and making insights consumable by more people is just beginning. What are the top technical and human interaction developments that will broaden the ways people interact with analytics?
Trusted data for all
Tutuk: That's a great question. We are doing a lot of research in this area in terms of creating new user experiences where we can bring about more data literacy and help improve people's ability to read, analyze, and argue with data.
In terms of the user experience, the conversational aspect has a big impact. But we also believe it's not only about conversation, especially when you want to understand data. The visual exploration part should also be there. We are creating experiences that combine the unique natural language and visual exploration capabilities of a human. We think that is the key to building good collaboration between the human and the machine.
Gardner: As a result, are we able to increase the number and types of people impacted by data by going directly to them -- rather than through a data scientist or an IT department? How are the interaction elements broadening this to a wider clientele?
Tutuk: The idea is to make analysis available to everyone, from C-level users to business end users.
If you want to broaden the use of analytics and lower the barrier, you also need to make sure that the data, the machines, and the system are governed and trusted.
Our enterprise data management strategy therefore becomes important for our Cognitive Engine technology. We are combining those two so that the machines use a governed data source to provide trusted information.
Gardner: What strikes me as quite new now is more interaction between human cognition and augmented intelligence. It's almost a dance. It creates new types of insights, and new and interesting things can happen.
How do you attain the right balance in the interactions between human cognition and AI?
Tutuk: It is about creating experiences that combine what the human is good at -- perception, awareness, and ultimately decision-making -- with what the machine is good at, such as running algorithms on large amounts of data.
As the machine serves insights to the user, it needs to first create trust about what data is used and the context around it. Without the context you cannot really take that insight and act on it. And this is where the human part comes in, because as humans we have the intuition and the business knowledge to understand the context of the insight. Then you can explore it further by being augmented. Our vision is for making decisions by leveraging that [machine-generated] insight.
Gardner: In addition to the interactions, we are hearing about the notion of storytelling. How does that play a role in how people get better analytics outcomes?
Storytelling insights support
Tutuk: We have been doing a lot of research and thinking in this area because today, in the analytics market, AI is becoming robust. These technologies are developing very well. But the challenge is that most of them deliver results like a black box. As a user, you don't know why the machine is making a suggestion or surfacing an insight. And that creates a big trust issue.
To have greater adoption of the AI results, you need to create an experience that builds trust, and that is why we are looking at one of the most effective and timeless forms of communication that humans use, which is storytelling.
So we are creating unique experiences where the machine generates an insight. And then, on the fly, we create data stories generated by the machine, thereby providing more context. As a user, you get a great narrative, and then that narrative is expanded with insightful visualizations. From there, based on what you gain from the story, we are also looking at capabilities where you can explore further.
And in that third step you are still being augmented, but able to explore. It is user-driven. That is where you start introducing human intuition as well.
And when you think about the machine first surfacing insights, then getting more context with the data story, and lastly going to exploration -- all three phases can be tied together in a seamless flow. You don't lose the trust of the human. The context becomes really important. And you should be able to carry the context between all of the stages so that the user knows what the context is. Adding the human intuition expands that context.
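[Editor's note: to make that three-phase flow concrete, here is a minimal, hypothetical sketch -- not Qlik's implementation -- of how an insight, a generated story, and user-driven exploration might share one context object so nothing is lost between stages. All names, fields, and the example figures are invented for illustration.]

```python
from dataclasses import dataclass, field

@dataclass
class AnalysisContext:
    """Carries the shared context (data source, filters, trail of steps) across all phases."""
    dataset: str
    filters: dict = field(default_factory=dict)
    notes: list = field(default_factory=list)

def generate_insight(ctx: AnalysisContext) -> str:
    # Phase 1: the machine surfaces an insight from the governed data source.
    ctx.notes.append("insight generated")
    return f"Churn in {ctx.filters.get('region', 'all regions')} changed notably quarter over quarter."

def build_story(ctx: AnalysisContext, insight: str) -> str:
    # Phase 2: wrap the insight in a narrative plus supporting visuals (titles only here).
    ctx.notes.append("story generated")
    return (f"Story: {insight}\n"
            f"Supporting charts: churn trend, cohort comparison (dataset: {ctx.dataset})")

def explore(ctx: AnalysisContext, follow_up_filter: dict) -> AnalysisContext:
    # Phase 3: the user drills in; the same context object is extended, never replaced.
    ctx.filters.update(follow_up_filter)
    ctx.notes.append(f"user explored with {follow_up_filter}")
    return ctx

ctx = AnalysisContext(dataset="sales_data", filters={"region": "West"})
story = build_story(ctx, generate_insight(ctx))
ctx = explore(ctx, {"segment": "enterprise"})
print(story)
print(ctx.notes)  # the full trail shows the user how the result was reached, preserving trust
```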
Gardner: I really find this fascinating because we are talking not just about problem-solution, we are talking about problem-solution-resolution, and then readjusting and examining the problem for even more solution and resolution.
We are also now, of course, in the era of augmented reality, where we can bring these types of data analysis outputs to people on a factory floor, wearing different types of visual and audio cue devices.
So the combination of augmented reality, augmented intelligence, storytelling, and bringing it out to the field strikes me as something really unprecedented. Is that the case? Are we charting an entirely new course here?
Tutuk: Yes, I think so. It's an exciting time for us. I am glad that you pointed out augmented reality because it's another research area that we are looking at. One of the research projects we have done augments employees on retail store floors.
The idea is, if an employee is doing shelf arrangement, for example, we can provide them information -- right when they look at a product -- about that product and what other products are being sold together with it. Then, right away at that moment, they are being augmented and can make a decision. It's an extremely exciting time for us, yes.
Gardner: It throws the idea of batch processing out the window. You used to have to run the data, come up with a report, and then adjust your inventory. This gets directly to the interaction with the end consumer in mind and allows for entirely new types of insights and value.
Tutuk: As part of that project, we also allow users to pin things in space. So imagine that you are in a warehouse, looking at a product, and you develop an interesting insight. Now you can just pin it in space on that product. And as you do that on different products, you can take a step back, take a look, and discover different insights across the products.
The idea is having a tray that you carry with you, like your own analytics coming with you, and when you find something interesting that matches with the tray -- with, for example, the product that you are looking at -- you can pin it. It's like having a virtual board with products and with the analytics being augmented reality.
Gardner: We shouldn't lose track that we are often talking about billions of rows of data supporting this type of activity, and that new data sets can be brought to bear on a problem very rapidly.
Putting data in context with AI2
Tutuk: Exactly, and this is where our Associative Big Data Index technology comes into play. We are bringing the power of our unique associative engine to massive datasets. And, of course, with our recent acquisition of Attunity, we gain data streaming and real-time analytics capabilities.
Gardner: Digging down into the architecture to better understand how it works, the Qlik Cognitive Engine increasingly works with context awareness. I have heard this referred to as AI2. What do you all mean by AI2?
Tutuk: AI2 is augmented intelligence powered by an associative index. Augmented intelligence is our vision for the use of artificial intelligence, where the goal is to augment the human, not to replace them. And now we are making sure that we have the unique component of our associative index as well.
Allow me to explain the advantage of the associative index. One of the challenges of using AI and machine learning is bias. The system has bias because it doesn't have access to all of the data.
For example, maybe you are trying to make a churn prediction for the western sales region. Normally, if you select the west region and the AI is running on a SQL or relational database, the system will only have access to that slice of data. It will never have the chance to learn from what is not associated, such as customers from the other regions, and to look at their behavior.
With the associative index, our technology provides the system with visibility into all of the data at any point, including the data that is associated with your context and also what's not associated. And the part that is not associated provides a good learning source for the algorithms that we are using. This is where we are differentiating ourselves and providing unique insights to our users that would be very hard to get with an AI tool that works only with SQL and relational data structures.
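[Editor's note: here is a hedged, simplified sketch of that contrast -- not the actual Cognitive Engine or associative engine. A SQL-style filter returns only the selected slice, while an associative view also keeps explicit track of the excluded rows, which can then feed the learning algorithm. The table and field names are invented for illustration.]

```python
import pandas as pd

# Toy customer table; in practice this would be billions of rows inside the engine.
customers = pd.DataFrame({
    "customer": ["A", "B", "C", "D", "E"],
    "region":   ["West", "West", "East", "South", "East"],
    "churned":  [1, 0, 1, 0, 0],
})

# SQL-style filtering: the model only ever sees the selected slice.
west_only = customers[customers["region"] == "West"]

# Associative-style view: the same selection, plus explicit visibility of
# everything that is *not* associated with it.
selection = customers["region"] == "West"
associated = customers[selection]
not_associated = customers[~selection]

print("Slice a SQL-style model sees:\n", west_only)
print("Excluded rows the associative view still exposes:\n", not_associated)

# A churn model given both views can compare West behavior against behavior it
# would otherwise never see, which helps reduce selection bias.
```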
Gardner: Not only is Qlik working on such next-generation architectures, you are also undertaking a larger learning process with the Data Literacy Program to, in a sense, make the audience more receptive to the technology and its power.
Please explain, as we move through this process of making intelligence accessible and actionable, how we can also make democratization of analytics possible through education and culturally rethinking the process.
Data literacy drives cognitive engine
Tutuk: Data literacy is important for helping people to read, analyze, and argue with data. We have an open program -- so you don't have to be a Qlik customer. It's available now. Our goal is to make everyone data literate. Through that program you can first understand the data literacy level of your organization. We have some free tests you can take, and then, based on that need, we have materials to help people become data literate.
As we build the technology, our vision with AI is to make the analytics platform much easier to use in a trusted way. So that’s why our vision is not only focused on prescriptive probabilities, it’s focused on the whole analytics workflow -- from data acquisition, to visualization, exploration, and sharing. You should always be augmented by the system.
We are at just the beginning of our cognitive framework journey. We introduced the Qlik Cognitive Engine last year, and since then we have exposed more features from the framework in different parts of the product, such as in data preparation. Our users, for example, get suggestions on the best way of associating data coming from different data sources.
And, of course, on the visualization and dashboarding side, we have visual insights, where the Cognitive Engine suggests insights right away. And now we are adding natural language capabilities on top of that, so you can literally interact with the data conversationally. More things will be coming on that.
Gardner: As an interviewer, as you can imagine, I am very fond of the Socratic process of questioning and then reexamining. It strikes me that what you are doing with storytelling is similar to a Socratic learning process. You had an acquisition recently that led to the Qlik Insight Bot, which to me is like interviewing your data analysis universe, and then being able to continue to query and generate newer types of responses.
Tell us about how the Qlik Insight Bot works and why that back-and-forth interaction process is so powerful.
Tutuk: We believe any experience you have with the system should be in the form of a conversation; it should have a conversational nature. There's a unique thing about human-to-human conversation -- just as we are having this conversation. I know that we are talking about AI and analytics. You don't have to tell me that as we are talking. We know we are having a conversation about that.
That is exactly what we have achieved with the Qlik Insight Bot technology. As you ask questions to the Qlik Insight Bot, it keeps track of the context. You don't have to reiterate the context and restate it with every question. And that is also a unique differentiator when you compare that experience to just having a search box, because when you use Google, for example, it doesn't keep the context. So that's one of the important things for us -- to have a conversation that allows the system to keep the context.
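[Editor's note: as a rough illustration of the idea -- not the Qlik Insight Bot's actual implementation -- the sketch below keeps a small context dictionary between turns, so a follow-up question such as "What about last quarter?" inherits the metric and region from the previous question. Everything here is hypothetical.]

```python
class ConversationContext:
    """Remembers what the conversation is already about between questions."""
    def __init__(self):
        self.state = {}   # e.g. {"metric": "churn", "region": "West"}

    def ask(self, question: str) -> str:
        q = question.lower()
        # Update the context only with what the new question actually mentions.
        if "churn" in q:
            self.state["metric"] = "churn"
        if "west" in q:
            self.state["region"] = "West"
        if "last quarter" in q:
            self.state["period"] = "last quarter"
        # The answer is built from the accumulated context, not just this question.
        metric = self.state.get("metric", "a metric")
        region = self.state.get("region", "all regions")
        period = self.state.get("period", "the current period")
        return f"Showing {metric} for {region}, {period}."

bot = ConversationContext()
print(bot.ask("How is churn in the West region?"))   # sets metric and region
print(bot.ask("What about last quarter?"))           # inherits metric and region
```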
Gardner: Moving to the practical world of businesses today, we see a lot of use of Slack and Microsoft Teams. As people are using these to collaborate and organize work, it seems to me that presents an opportunity to bring in some of this human-level cognitive interaction and conversational storytelling.
Do you have any examples of organizations implementing this with things like Slack and Teams?
Collaborate to improve processes
Tutuk: You are on the right track. The goal is to provide insights wherever and however you work. And, as you know, there is a big trend toward collaboration. People are using Slack instead of just emailing, right?
So the Qlik Insight Bot is available with integrations to Microsoft Teams, Slack, and Skype. We know this is where the conversations are happening. If you are having a conversation with a colleague on Slack and neither of you knows the answer, you can just continue the conversation by including the Qlik Insight Bot and be powered with Cognitive Engine insights that you can make decisions with right away.
Gardner: Before we close out, let's look to the future. Where do you take this next, particularly in regard to process? We also hear a lot these days about robotic process automation (RPA). There is a lot of AI being applied to how processes can be improved while allowing people to do what they do best.
Do you see an opportunity for the RPA side of AI and what you are doing with augmented intelligence and human cognitive interactions to somehow reinforce one another?
Tutuk: We realized with RPA that there are challenges with the data there as well. It's not only about the human and the human's interaction with the automation. Every process automation generates data. And one of the things that I believe is missing right now is a full view of the entire automation process. You may have 65 different robots automating different parts of a process, but how do you provide the human a 360-degree view of how the process is performing overall?
A platform can gather the associated data from the different robots and then provide the human a 360-degree view of what's going on in the process. Then that human can make decisions, again, because as humans we are very good at making decisions by seeing nonlinear connections. Feeding the right data to us so that we can use that capability is very important, and our platform provides that.
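[Editor's note: as a hedged sketch of that idea -- not a specific Qlik or RPA vendor API -- the example below merges event logs from several hypothetical robots into one end-to-end view of a process, so a person can see where the overall flow slows down. The robot names, cases, and timings are invented.]

```python
import pandas as pd

# Hypothetical event logs emitted by three different robots in one process.
robot_logs = {
    "invoice_capture":  pd.DataFrame({"case": [1, 2], "minutes": [3, 4]}),
    "approval_routing": pd.DataFrame({"case": [1, 2], "minutes": [20, 35]}),
    "payment_posting":  pd.DataFrame({"case": [1, 2], "minutes": [2, 2]}),
}

# Associate the per-robot data on the shared case id to form a single view.
merged = None
for step, log in robot_logs.items():
    log = log.rename(columns={"minutes": step})
    merged = log if merged is None else merged.merge(log, on="case")

merged["total_minutes"] = merged.drop(columns="case").sum(axis=1)
print(merged)
# Reading this 360-degree view, a person can spot that approval_routing, not the
# individual robot steps, is where the end-to-end process bogs down.
```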
Gardner: Elif, for organizations looking to take advantage of all of this, what should they be doing now to get ready? To set the foundation and build the right environment, what should enterprises be doing to be in the best position to leverage and exploit these capabilities in the coming years?
Replace repetitive processes
Tutuk: Look for the processes that are repetitive. Those aren't the right places to use unique human capabilities. Determine those repetitive processes and start to replace them with machines and automation.
Then make sure that whatever data they are feeding into this is trustworthy and comes from a governed environment. The data generated by those processes should be governed as well. So have a governance mechanism around those processes.
I also believe there will be new opportunities -- new jobs and new ideas that humans will be able to start pursuing. We are in an exciting new era. It's a good time to find the right places to use human intelligence and creativity as more automation takes over the repetitive tasks. It's an incredible and exciting time. It will be great.
Gardner: These strike me as some of the most powerful tools ever created in human history, up there with the wheel and other things that transformed our existence and our quality of life. It is very exciting.
I'm afraid we will have to leave it there. You have been listening to a sponsored BriefingsDirect discussion on the latest research and products that bring the power of people and augmented intelligence closer than ever.
And we have learned about strategies for not only visualizing data-driven insights but making them conversational -- and even presented through storytelling. So a big thank you to our guest, Elif Tutuk, Head of Research at Qlik. Thank you very much.
Tutuk: Thank you very much.
Gardner: And a big thank you to our audience as well for joining this BriefingsDirect business intelligence trends discussion. I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host throughout this series of Qlik-sponsored BriefingsDirect interviews.
Thanks again for listening. Please pass this along to your IT community, and do come back next time.
Copyright Interarbor Solutions, LLC, 2005-2019. All rights reserved.