Nakala Analytics Ltd - Blog https://nakala-analytics.co.ke/blog-list Thu, 11 Sep 2025 02:10:50 +0300 Joomla! - Open Source Content Management en-gb info@nakala-analytics.co.ke (Nakala Analytics)

Non-invasive Blood Analysis using AI https://nakala-analytics.co.ke/blog-list/non-invasive-blood-analysis-using-ai

 

Blood analysis, also known as a blood test, is the analysis of blood components. A healthcare provider may order blood analysis to help diagnose diseases such as diabetes, cancer, and viral infections; to check whether organs such as the kidneys, heart, and liver are working; to detect health problems at an early stage; and to track how well you are managing existing health conditions.

Normally, blood analysis starts with collecting a blood sample, on which different tests are then performed. But what if blood analysis could be done with just a simple scan of the skin, with no need for blood collection at all?

Well, Bloods.ai is creating a technology that would enable just that: non-invasive blood analysis. The technology will use machine learning models to classify the levels of specific chemical compounds in samples from their spectroscopic data. It is a fusion of spectroscopy, blood analysis, and artificial intelligence.

When a light beam passes through a sample, each compound in the sample absorbs or transmits light over certain wavelengths, and different compounds absorb best at different wavelengths. Thus, if you use a beam of light containing a range of wavelengths, you can measure the amount of energy absorbed at each wavelength. Such a measurement over different wavelengths (or frequencies) is called a spectrum (or spectral data). This technology will make use of spectral data in the Near-Infrared (NIR) range, because NIR light has the highest penetration power and goes deepest into human tissue compared to the other wavelengths.
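
To make the idea concrete, here is a small sketch of how a spectrum arises from a mixture. It assumes the Beer-Lambert law (absorbance at each wavelength is absorptivity times path length times concentration, summed over compounds); the compounds, wavelengths, and coefficients are illustrative values, not real spectroscopic data.

```python
# Beer-Lambert sketch: absorbance at each wavelength is the sum over compounds
# of absorptivity x path length x concentration. All values are illustrative.

def absorbance_spectrum(absorptivities, concentrations, path_length=1.0):
    """absorptivities: dict of compound -> absorptivity at each wavelength."""
    n = len(next(iter(absorptivities.values())))
    spectrum = [0.0] * n
    for compound, eps in absorptivities.items():
        c = concentrations[compound]
        for i, e in enumerate(eps):
            spectrum[i] += e * path_length * c
    return spectrum

# Two hypothetical compounds measured at three NIR wavelengths
eps = {"glucose": [0.10, 0.40, 0.05], "water": [0.02, 0.01, 0.30]}
conc = {"glucose": 5.0, "water": 50.0}
print(absorbance_spectrum(eps, conc))  # [1.5, 2.5, 15.25]
```

A real instrument measures thousands of wavelengths, and the inverse problem, inferring concentrations from a measured spectrum, is precisely what the machine learning models described below are for.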

Just like other mobile health technologies, this technology will extend the reach of healthcare beyond traditional clinical settings. A model will be incorporated into devices that will allow you to do your blood analysis in less than a minute, even at home. “We can make blood analysis to be a commodity much the same as measuring our weight,” says Oded Daniel, CEO and co-founder of Bloods.ai. “Weight is an indicator that tells a lot about our lifestyle, so imagine what is going to happen if we give people the ability to tap into those 2000 other compounds that are in our blood in the same level of ease as we weigh ourselves at home.”

Oded says that the whole project has three stages: data collection, building machine learning models from the collected data, and deployment. Currently, the second stage is in progress: the data scientists are building machine learning models, and the best model so far has an accuracy of 0.9. For the first stage, a collection system was deployed through which about 60 to 70 health institutes around the world send their bio-data, from which extensive datasets are created for the data scientists to build models. Data collection is a continuous process and is still ongoing.
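
As a rough illustration of what stage two involves, the toy model below classifies a compound level (“low” vs “normal”) from spectral readings using a nearest-centroid rule. This is a stand-in for Bloods.ai's actual, unpublished machine learning pipeline; the spectra, labels, and numbers are all invented.

```python
# Toy nearest-centroid classifier: learn the mean spectrum per label from
# training samples, then assign a new spectrum to the closest mean.

def train_centroids(samples):
    """samples: list of (spectrum, label); returns label -> mean spectrum."""
    sums, counts = {}, {}
    for spectrum, label in samples:
        acc = sums.setdefault(label, [0.0] * len(spectrum))
        for i, v in enumerate(spectrum):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lbl: [v / counts[lbl] for v in acc] for lbl, acc in sums.items()}

def classify(centroids, spectrum):
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda lbl: dist(centroids[lbl], spectrum))

training = [([0.9, 0.2], "low"), ([1.1, 0.3], "low"),
            ([2.0, 1.1], "normal"), ([2.2, 0.9], "normal")]
model = train_centroids(training)
print(classify(model, [1.0, 0.25]))  # prints: low
```

A production model would work on far higher-dimensional spectra and be evaluated against clinical ground truth, but the principle, mapping spectral data to compound levels, is the same.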

The final stage is planned to take place in two phases. The first phase will be to create self-testing kiosks containing the testing devices, where you will be able to get your blood analysis done. These kiosks will be in public places such as stores, supermarkets, gyms, restaurants, and airports. Ideally, this technology is meant to create lifestyle awareness, so apart from getting your blood analysis done, you will also get lifestyle recommendations based on the results. For example, a kiosk machine located in a supermarket could recommend which groceries you should buy based on your cholesterol levels, measured by a simple scan. The second phase will be to bring blood analysis into the home by making the application available on personal devices such as phones and watches. Just like with the self-testing kiosks, you will get your blood analysis done, only this time from anywhere, even at home.

According to Oded, this technology will not only create lifestyle awareness but also predict the early stages of diseases. In addition to giving lifestyle recommendations, it will be able to indicate the possibility of a disease and whether you need to visit a doctor for further consultation.

The blood contains many compounds, each with its own effect or influence on our health. This technology will be the first of its kind to tap into this vast source of knowledge, explains Oded.

 

Purity, Nakala Analytics

info@nakala-analytics.co.ke (Peter) Blog Fri, 11 Feb 2022 11:23:31 +0300
Data science crash course for business https://nakala-analytics.co.ke/blog-list/data-science-crash-course-for-business

Overview

Leaders today manage organizations operating in highly complex, dynamic, and globally competitive business environments. In this age of data, the world is becoming more data-driven by the day, and scientific data analysis has become all-pervasive, making it one of the fastest-growing and most essential fields in management. Managers need to learn how to interpret their data to make sound decisions: operational data should maximize your impact while reducing the work required to achieve strategic objectives, and the ability to use internal and external data should help you solve business problems and avoid the insanity of repeating the same action while expecting different results. Only leaders with a razor-sharp ability to consume and interpret data-rich environments, and to translate that precisely into strategic and operational decisions, will successfully command future industry leadership and competitive dominance. Our world-class offering is designed for business professionals seeking to be agents of transformative change within their organizations.

Learning Outcomes.

At the end of this course, learners will gain the ability to:

  1. Learn the modern way to extract data from relational databases.
  2. Learn modern ways to explore, prepare, and assess the quality of data using Python, and to present data using Tableau.
  3. Learn how to use basic to advanced analytics techniques for business forecasting, recommendation, and customer segmentation.
  4. Quickly and easily use actionable insights to improve decision-making.
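
As a small taste of outcomes 1 and 2, the sketch below extracts rows from a relational database with SQL and runs a quick data-quality check in Python, using the built-in sqlite3 module. The table, columns, and values are invented for the example.

```python
# Extracting data with SQL and doing a quick quality check in Python, using
# the standard-library sqlite3 module. Table and values are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("East", 120.0), ("East", 80.0), ("West", None)])

# Let SQL do the aggregation close to the data
rows = conn.execute(
    "SELECT region, COUNT(*), SUM(amount) FROM sales "
    "GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('East', 2, 200.0), ('West', 1, None)]

# Simple data-quality assessment: how many amounts are missing?
missing = conn.execute(
    "SELECT COUNT(*) FROM sales WHERE amount IS NULL").fetchone()[0]
print("missing amounts:", missing)  # missing amounts: 1
```

In the course itself, the same workflow runs against Postgres for extraction and Tableau for presentation; sqlite3 simply keeps the example self-contained.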

Toolkit.

  • SQL, Tableau, Python/R, Postgres

Course Target Group.

Understanding how data science techniques can be applied to influence successful strategic decision-making is crucial for every analyst in any modern organization. This training course is essential for professionals with an interest in data science and is well suited for:

  1. Operations.
  2. Accountants. 
  3. HR.
  4. Management.
  5. All business professionals.

Course Schedule

Week 1: Data extraction & data inventory assessment using SQL.

Week 2: Exploratory data analysis using Tableau & Take-home project

Week 3: Introduction to data analytics models using Python & Take-home project

Week 4: Machine learning (Predictive, associative & segmentation models) & Capstone project 
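
To give a flavour of the Week 4 segmentation material, the toy snippet below runs a minimal k-means on a single invented feature (annual customer spend). It is a from-scratch illustration using only the standard library; the actual coursework uses dedicated Python libraries.

```python
# Minimal 1-D k-means: repeatedly assign each value to the nearest center,
# then move each center to the mean of its cluster.

def kmeans_1d(values, centers, iters=10):
    for _ in range(iters):
        clusters = {c: [] for c in centers}
        for v in values:
            nearest = min(centers, key=lambda c: abs(c - v))
            clusters[nearest].append(v)
        centers = [sum(pts) / len(pts) if pts else c
                   for c, pts in clusters.items()]
    return sorted(centers)

# Invented annual spend for six customers: two natural segments emerge
spend = [10, 12, 11, 90, 95, 100]
print(kmeans_1d(spend, [0, 50]))  # [11.0, 95.0]
```

The two resulting centers separate low spenders from high spenders, which is exactly the kind of customer segmentation the capstone project builds on, only with many more features.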

Training Methodology

This training course will combine instructor-led presentations with interactive discussions shaped by the participating delegates and their interests. It is presented in a very hands-on way, to suit individuals with varying levels of knowledge and experience. In addition, practical exercises, video material, and case studies will stimulate and support these discussions to maximize the participants' learning. Above all, the course facilitators will make extensive use of case examples and case studies based on real-life strategic issues and situations in which they have been personally involved.

Follow up procedure

We follow up with the learners within one month after training to ensure the imparted skills are being put into practice.

Course Duration, Location & Investment

 

Duration:           4 Weeks (4 Hours per week).

Venue:               Remote (Evening Classes & Weekends).

Investment:       Ksh. 23,000 per head / Ksh. 40,000 for groups of 2

 

Registration

If you are interested, please CLICK HERE to register and book your place with us. Thanks!
info@nakala-analytics.co.ke (Peter) Blog Fri, 04 Feb 2022 13:44:25 +0300
Benefits of outsourcing your big data initiative https://nakala-analytics.co.ke/blog-list/10-benefits-of-outsourcing-your-big-data-initiative

Market research reports predict that the market for data analytics outsourcing will be valued at a staggering $20 billion by 2026, growing at a CAGR of 29.4%. These reports underline the importance of outsourcing data analytics over the next few years. So, let us examine the benefits you could reap by outsourcing your big data initiative.

Outsourcing is a business practice in which services or job functions are farmed out to a third party. In a big data initiative, it covers a wide range of operations such as big data processing, analysis, storage, and management. Companies may choose a service onshore (within the same country), nearshore (in another country in the same time zone), or offshore (in a more distant country).

It is no longer a question of “IF” you should incorporate big data into your operations. What your business should be asking is “How do we get started?”

Recruiting a big data analytics team can slow down a company's development on many different levels: hiring top big data experts may force drastic budget cuts in other departments, especially during these tough economic times. Cutting the budget for key services will inevitably affect your company's overall profitability, which makes outsourcing big data tasks the better solution.

Reasons why you should outsource your big data initiative

1. Win by all means

Consultants are specialized and dedicated to delivering value, insulated from normal operational disruptions. With big data consultants there is little room for delay; you win either way.

2. Room for innovation & operational disruption

Outsourcing your data work makes way for innovative and creative ways to interpret data by specialist analysts.

3. Save cost and time

Save on the cost of hiring, onboarding, and training a new hire. It takes at least six months to bring a data analytics hire up to full productivity, and attrition rates among data analysts are high.

4. Quality of work

Consultants adopt best practices and follow through individual deliverables as per their mandate. You can be assured of a self-push to delivery.

5. Solves the problem of hiring the wrong candidate

It is the responsibility of the consulting firm to source the best skills while ensuring timely delivery.

6. Higher success ratio

Outsourcing provides a higher success ratio than traditional methods of hiring.

To make it work, be certain of your needs. Clearly state the need for having a dedicated outsourced analytics team. The outputs and deliverables must be precise, clear, and ones that truly inform your decision-making.

Why Outsource Your Analytics to Nakala!

 

info@nakala-analytics.co.ke (Nakala Analytics) Blog Fri, 10 Jul 2020 10:26:13 +0300
Big data will play key role post COVID19 https://nakala-analytics.co.ke/blog-list/big-data-will-play-key-role-post-covid19

“Data is the new science. Big data holds the answers.” – Pat Gelsinger

According to DOMO's report, “Over 2.5 quintillion bytes of data are created every single day, and by 2020 it is estimated that 1.17MB of data will be created every second for every person on Earth.” In the 21st century, data is the new currency.

Photo by Negative Space from Pexels

Post COVID19, we will witness a new global order of data orientation, which elevates the significance of Big Data in the world.

So, let’s have a look at what big data is all about. Big Data is a pool of large amounts of data, structured or unstructured, which with the right orientation and software techniques can be managed efficiently for various useful purposes.

Big Data is commonly characterized by six V's: Volume, Velocity, Variety, Veracity, Variability, and Value.

Most countries of the world, whether developed or developing, face two broad challenges in the upcoming post-COVID19 world:

1) Vulnerability to health issues

2) Rising unemployment

Does big data have the answer? The answer is yes!

AETNA looks at patient results on a series of metabolic-syndrome-detecting tests, assesses patient risk factors, and focuses on treating the one or two things that will have the most impact on improving their health. It found that 90% of patients who didn’t have a previous visit with their doctor would benefit from screening, and 60% would benefit from improving their adherence to their medicine regimen.

EVOVL helps large companies make hiring and management decisions through analytics. These two examples show how the post-COVID19 world can be tackled efficiently. Now a question arises again: why big data?

Firstly, the pandemic has shaken vibrant, giant economies all over the world, which need an immediate revival to normalcy.

Secondly, big data can play the role of game changer by enabling efficient and on-time implementation of various government projects: for example, accumulating data on who lost their jobs in the pandemic, together with the skills they hold, so that they can be matched to work that fits those skills.

Thirdly, it minimizes leakages and misinformation and excludes human error.

Big data has scope in multidisciplinary fields: aiding security agencies, disaster management, calculating GDP, financial data, environmental problems, and many more.

Getting back to normalcy after COVID is challenging, but with the help of big data the world can emerge as a new, efficient, data-driven world.

info@nakala-analytics.co.ke (Nakala Analytics) Blog Thu, 02 Jul 2020 18:02:37 +0300
Analytics efforts work when focused to specific product performance https://nakala-analytics.co.ke/blog-list/analytics-efforts-work-when-focused-to-specific-product-performance

Analytics Efforts are Only Beneficial if Tailored to Address Specific Product Performance

Technology, innovation and ever-changing trends have entirely changed the way businesses operate and how we have been looking at products. Be it marketing or product design and development, everything is customized these days.

Customers are looking for a more tailored and personalized approach to everything, be it how you sell them the product or how you design and develop the product itself.

Photo by Timur Saglambilek from Pexels

Data analytics has played a crucial role when it comes to designing and developing customer-centric products. But, how exactly do we use data analytics for addressing product performance?

It’s true that leveraging the power of data analytics to the maximum can enhance the proficiency of the products, improve advertising techniques, and support business growth. But, it’s also true that analytics efforts are only beneficial if tailored to address specific product performance.

You must be right in your approach here. Let’s see how this unfolds and what exactly we mean.

The Role of Analytics

In the simplest of terms, analytics measures the state of the product. This can be anything: how users are interacting with the product, what they are doing, where they are clicking, and so on.

The purpose of analytics is to judge what is going on with the product, as measured by various metrics. And all of these insights when interpreted the right way, help with product improvement.

Analytics is the primary source of feedback you get on your product, and it is crucial to product management and product improvement. Without analytics, you won’t ever really know what’s going on with your product, or whether you are headed in the right direction.

The key results, insights, and metrics brought to the forefront by analytics help product teams make informed decisions about what’s not working out, which product functionality needs to be upgraded, or which specific feature demands additional capabilities.

And, this is also the primary reason why your analytics efforts should always be focused on a specific part of the product’s performance, rather than taking everything into the picture at once. Without analytics, product teams would never realise or understand whether the revisions implemented have solved customers’ problems.

What you don’t measure, you can’t improve.

And, if you measure as a whole, you can’t pinpoint where exactly the issue lies.

Directing Your Analytics Efforts in the Right Direction

What a lot of businesses do while implementing their analytics plans is throw in a lot of seemingly complex, rich-in-insights analytics packages and track almost all sorts of data relevant to the product “as a whole”.

However, this approach seldom works!

 

Don’t do this!

This approach never helps because, to begin with, you, as a product manager, don’t know what you are looking for.

Not every feature of the product is data-driven, and not every feature plays the same role in making the product a success. Before implementing your analytics efforts, think about which analytics would let you reflect on the performance of the product the fastest.

Going the other way round, you’d just end up with an overwhelming volume of data: you won’t have any vision for it, and you’ll feel drowned in this sea of data, ending up latching onto vanity metrics.

Thus, it’s super important that before you implement your analytics plan, you are crystal clear about which parts of the product’s performance you need to track, what exactly your end goal looks like, and what data is relevant to you.

 

The key is to track relevant data points, not a whole lot of data!

 

Start with creating a plan that couples the data points you measure with the product vision you and your team had at the beginning of the development and design process along with the product’s key performance indicators (KPIs).

Pros of Working With Specific Data Points

Easy to Report

When studying the feedback for a product, you are expected to define if the improvements introduced have been a success or a failure.

And, for that to happen, you must understand the architecture of the product very well: see which metrics define the success or failure of which feature, and what still needs to be worked on.

Instead of being drowned in a sea of data, tracking data relevant to achieving KPIs makes it easy to report and interpret.

If you don’t report the analytics you track, it’s a waste of time to track them anyway.

 Photo by Startup Stock Photos from Pexels

Common reporting methods such as trends and comparisons make sense only when you report them specifically for a functionality. It would be a great value addition if you are also capable of reporting them using visualization techniques.

For instance, if you are managing a social media platform, it makes more sense to individually track and report analytics on specific features such as the share option or the search option.

You should be focused on understanding what issues the audience is encountering with these data driven features of the product, rather than the product as a whole.

Helps Deliver Relevant Products

While you are focused on improving one feature at a time, you deliver better products with relevant features. Understanding customer insights and improving a particular part of the product helps decrease complexity.

Effective data collection and analysis helps companies stay competitive and on top of trends. Plus, leveraging predictive analytics helps get insights on what is expected from brands in the coming times and what pain points people are struggling with.

Thus, in addition to improving existing products, companies have an excellent opportunity to expand into new markets and develop new products. The optimization of Volusion's trial page is a good example of tailoring analytics efforts to a specific part of the product. To improve the lead generation rate, a new registration page was created and an A/B test was run against the then-current trial page.

The previous trial page was overloaded with information about the product and had a lot of CTAs and places to click around. Informed by this analysis, the newly designed trial page kept only brief information about the trial (“No credit card required”) and removed all the possible distractions.

Thus, rather than modifying the entire product, the lead conversion analytics were used to address the issue with the trial page alone. Further, when this didn’t work, the analytics efforts were narrowed down by segmenting the audience on the basis of location.
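
The kind of A/B test described above is typically judged with a simple statistical test. The sketch below compares sign-up rates on an old and a new trial page with a two-proportion z-test; the visitor and sign-up counts are invented for illustration, not Volusion's actual numbers.

```python
# Two-proportion z-test: is the new page's conversion rate significantly
# different from the old page's? Counts below are invented.
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# old page: 200 sign-ups from 5000 visitors; new page: 260 from 5000
z, p = two_proportion_z(200, 5000, 260, 5000)
print(round(z, 2), p < 0.05)  # a larger z and p < 0.05 suggest a real lift
```

Without a check like this, a small difference in conversion rates could easily be noise rather than a genuine improvement.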

Informed Product Decision Making

Focusing on one feature at a time makes it easy to make informed decisions and wise choices.

Delivering relevant products also includes NOT overloading your product with irrelevant features. It is important to design the architecture of your product in such a way that it solves the problem of the user without feeling overcrowded.

And, this is possible only when you think about each of the features individually rather than focusing on the product as a whole. Analytics are vital for product design, development and improvement as they tell you what exactly is going on with your product and how your audience has been receiving it.

Before you think you are all set to launch your product, you must understand and decide what needs to be tracked and reported. This forms the criteria for choosing which data points are relevant, how to measure them, and how to use them for product improvement.

info@nakala-analytics.co.ke (Nakala Analytics) Blog Wed, 17 Jun 2020 12:41:14 +0300
How Natural Language Processing Will Drive the Era of Online Doctors https://nakala-analytics.co.ke/blog-list/how-natural-language-processing-will-drive-the-era-of-online-doctors

 

Artificial Intelligence, and technology in general, has made its impact felt on all walks of life. Medicine and healthcare are no exception.

Quick and impressive improvements in AI have already taken healthcare by storm.

Be it improving the accuracy and efficiency of diagnosing and treating ailments across various specializations, or speech recognition in clinical documentation, AI has been miraculous.

The innovation has been such that a lot of experts believe that AI may even be able to replace doctors altogether, especially radiologists.

The speculation that your future doctor may not be a human is at an all-time high.

Photo by Edward Jenner from Pexels

To what extent are these claims true, and will AI be able to replace doctors soon?

How would various fields of AI, such as Natural Language Processing and Machine Learning, transform medicine in the coming years?

Although we are not very convinced by the claim that AI would be able to replace doctors altogether, we can consider the possibility of AI augmenting the role of doctors and helping introduce some key transformations in the way the healthcare sector has been operating.

Let’s study this further and understand how Natural Language Processing will drive the era of online doctors.

Natural Language Processing in Healthcare

Healthcare databases are growing at an unimaginable pace.

There is lots of data.

Text analytics and natural language processing (NLP) help extract information from this data; they help turn it into something that holds value.

4 Key Areas in Healthcare Where NLP Would Help

Translating Free Text into Standardized Data

Natural Language Processing helps complete electronic health records by translating free text into standardized data, while also enhancing the accuracy of the same.

NLP can help us get a lot of meaningful information accessible by free-text query interfaces.

We can expect the future doctors to dictate their patient notes to a robot or an interface, saving the time spent on maintaining documentation and letting them pay undivided attention to the patient.

Also, it may be possible for the doctors to create customized educational materials for patients ready for discharge.

Furthermore, given an arbitrary piece of text, NLP could be used to identify and extract keywords such as symptoms or pain points with precision.

This can take the medical world by storm because it may serve as a replacement for a doctor.

The precision of extraction can be questionable, though, at least to begin with, and may need a real doctor's supervision.
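
A toy sketch makes both the promise and the precision problem concrete: a naive keyword extractor, assuming a small hand-made symptom vocabulary, happily extracts “fever” from a note that actually says “no fever”, which is exactly why a real doctor's supervision is still needed. Real clinical NLP relies on trained models and curated medical vocabularies, not a lookup like this.

```python
# Naive symptom extraction by dictionary lookup. The vocabulary is invented
# and tiny; note that negation ("no fever") is NOT handled.
import re

SYMPTOM_TERMS = {"nausea", "headache", "fatigue", "fever", "dizziness"}

def extract_symptoms(note):
    words = re.findall(r"[a-z]+", note.lower())
    return sorted(set(words) & SYMPTOM_TERMS)

note = "Patient reports persistent headache and mild nausea; no fever."
# 'fever' is wrongly extracted despite the negation "no fever"
print(extract_symptoms(note))  # ['fever', 'headache', 'nausea']
```

Closing this gap, handling negation, abbreviations, and context, is precisely where trained clinical NLP models earn their keep.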

Extracting Information from Vague Notes

Many a time while recording information, doctors keep typing everything in one place.

Generally, this happens to be the notes section of the electronic medical record.  

These blurbs are mostly lost and even if we want to revisit this unstructured information for crucial insights, it’s a manual process that consumes a lot of time.

Natural Language Processing can be employed here to automatically extract these insights and save a lot of time.

Further, these blurbs contain not only clinical information about the patients but also a lot of social and cultural insights, mentioned casually while taking notes, which can't simply be left out.

Making it Easy for Radiologists

Radiology has seen a lot of improvements over time.

Photo by Anna Shvets from Pexels

Today we have better machines for ultrasound, Computed Tomography (CT), Magnetic Resonance Imaging (MRI), and Positron Emission Tomography (PET) scanning technologies.

However, in the era of online doctors, the next improvement won’t come in the form of an improved ultrasound machine, but from AI utilizing the imaging data and interpreting it to provide an output.

This includes steps such as automatic segmentation of various structures within CT and MR images.

In the coming few years, AI will be a common sight in a radiologist’s clinic.

Medical images may be pre-analyzed by an AI tool before being reviewed by a radiologist.

The tool would augment the radiologist’s job and help perform key routine operations such as pattern recognition with ease.

Answering Common Queries

There can be a lot of days when the ailment is not serious enough to visit the doctor’s clinic.

Or, there can be days when you need answers to some common questions but won’t find them on Google because they are unique to your situation.

NLP can be called to your rescue here.

In the era of online doctors, NLP can be used to create something that responds to medical queries from both patients and physicians.

For instance, you may want to ask “Can I take Paracetamol while I am pregnant?” or “I have been taking this drug and now I’ve been feeling a bit nauseous, is that normal?”.

For such queries, NLP can be used to extract the relevant medical terms and surrounding context.

Further, using these, we need to retrieve the documents most responsive to the question terms from a repository of curated answers, much as Google serves answers to search queries.

However, this is not similar to robots that have a predefined set of answers to a predefined set of questions. It requires the best of AI to come up with answers of high clinical precision.
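
As an illustration of that retrieval step, the sketch below scores a tiny invented repository of curated answers against the query terms using TF-IDF, a standard information-retrieval technique, and returns the best match. A production system would be far more sophisticated, but the shape of the problem is the same.

```python
# Toy TF-IDF retrieval over a tiny invented repository of curated answers.
import math
from collections import Counter

answers = [
    "Paracetamol is generally considered safe during pregnancy at normal doses",
    "Nausea is a common side effect of many drugs and often settles",
    "Ibuprofen should be avoided in the third trimester",
]

def tokenize(text):
    return text.lower().split()

docs = [Counter(tokenize(a)) for a in answers]          # term frequencies
df = Counter(t for d in docs for t in d)                # document frequencies

def score(query, doc):
    # sum of tf * idf over the query terms present in the corpus
    n = len(docs)
    return sum(doc[t] * math.log(n / df[t]) for t in tokenize(query) if t in df)

query = "paracetamol pregnancy"
best = max(range(len(answers)), key=lambda i: score(query, docs[i]))
print(answers[best])  # the paracetamol-in-pregnancy answer scores highest
```

The clinical bar is much higher than a web search, of course: the retrieved answer must be medically precise, which is why the repository itself has to be curated by professionals.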

NLP in Action

3M is a renowned provider of NLP empowered solutions for healthcare.

3M’s NLP empowered product helps automate the process of extracting useful information and numerous clinical concepts from unstructured data. This includes the free text in notes of the doctor, EHRs and other reports.

The software scans doctors' reports containing unstructured data using NLP and turns them into meaningful information that can be processed further.

Photo by Gustavo Fring from Pexels

Amazon offers a product called Amazon Comprehend Medical that makes it very easy for doctors to extract relevant information from their notes and blurbs.

Everything about the patient, right from the ailment to the prescribed medicine along with its frequency, can be collected from a number of sources such as the notes taken by the doctor during consultation and patient health records.

The tool claims that this data can further be used to select a set of patients for a clinical trial of a vaccine or any other experiment, or just to segment patients for better understanding of their symptoms and effects of certain medications on them.

Thus, these were some of the ways in which NLP has been and is expected to revolutionize the medical world.

In an era where we are looking for mechanical replacements for almost everything, NLP can be critical.

NLP is what helps machines understand the human language with context.

The key to developing better solutions is to focus on creating algorithms that are not only accurate but increasingly smart, intelligent, and specific to healthcare.

The role of NLP and AI in healthcare is more about augmenting the role of doctors rather than replacing them altogether.

If we are able to achieve this, there is no limit on what doors could be open in the future.

info@nakala-analytics.co.ke (Nakala Analytics) Blog Fri, 12 Jun 2020 16:11:40 +0300
How to Use Data and Analytics to Tell a Story https://nakala-analytics.co.ke/blog-list/how-to-use-data-and-analytics-to-tell-a-story

 

Hubspot says that billions of people use social media globally. Do you agree that if your business is not online, sooner or later it will run out of business? With such overwhelming engagement of people on social media, it has undoubtedly emerged as one of the best channels to get new leads and promote your business. However, how do you catch your audience’s attention amid such huge competition?

Why Storytelling?

What is it about your business that drives people crazy? What is it that forces them to press that “like” button on your post or leave a comment in the “comments section below”?

Photo by Lukas from Pexels

Well, the goal is to make a human connection, to stand apart from the crowd mechanically leveraging social media handles. It’s a well-established fact that all of us love stories; we relate to a well-told story like nothing else. Stories help your prospects make sense of the decisions they are about to make, whether it’s the decision to subscribe to your email list or to buy your product’s annual subscription. Your story is “why” you are doing “what” you are doing and “how” it makes a real difference to the world. It’s about standing out and not blending in! But how do you tell great stories? What does a great story look like?

What Do You Need to Tell a Great Story? 

A Great Story = Visualization + Context + Content

Data, visuals, and graphics have long been used to tell great stories. However, over the past couple of years, the “greatness” has disappeared from the stories, and plain stories are left behind: a graph displaying some statistics related to business intelligence and analytics pretty much sums up how data is leveraged to tell stories today. Telling stories using data and analytics that people can actually relate to demands rich and intuitive data visualizations.

Overwhelming the audience with flashy, non-targeted visualizations that are hard to decipher ruins almost everything a great story has to tell. Telling stories with data and analytics involves a lot more than creating a bar chart and uploading it to a dashboard or PPT. Context and accompanying content are indispensable.

A great story using data and analytics must convey why the information being shown is relevant to a business’s strategies and operations or how it solves the pain point of its consumers and potential customers.

What else does a great story take? Detail, to an impressive level. Seemingly insignificant details must paint a picture in someone’s mind to truly complete the story. Including data and analytics in your marketing strategy should enable you to tell the stories crucial for successful campaigns and customer journeys. However, understanding all the data correctly, extracting the key takeaways, and turning them into a great story is not easy. Many organizations struggle with this.

How to Tell a Great Story Using Data and Analytics?

It’s All About Engagement

Some of the great minds working in data and analytics can comprehend huge data sets, yet they often fail when asked to help someone else understand the findings. Stories are all about engagement. One of the must-have skills for anyone working with data analytics is the art of presenting data in an engaging way. Your story should be digestible and should get people asking questions. People commonly turn to data visualizations for this, but your creativity lies at the heart of deciding which visualization brings the best out of the data.

Not all graphs are easy to understand! Brainstorm hard and decide how you want to present the data: charts, graphs, infographics, and so on. Studies indicate that audiences prefer visual elements to raw numbers in presentations, and remember information more accurately, and for longer, when it is presented visually. You can turn to tools such as Taswira for a new way to report and tell effective data stories. The tool helps you turn data into compelling visualizations with context, driving the idea home properly.
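To make the "lead with the takeaway" idea concrete, here is a minimal sketch of a chart whose title states the story rather than a generic label. All channel names and numbers below are made up for illustration, and the file name is arbitrary:

```python
# A minimal "story-first" chart: the title carries the takeaway,
# and the bar the story is about is highlighted. Data is hypothetical.
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

channels = ["Email", "Social", "Search", "Referral"]
leads = [120, 340, 210, 90]  # hypothetical new leads per channel

fig, ax = plt.subplots(figsize=(6, 4))
ax.bar(channels, leads, color="#4C72B0")
# The title states the "so what", not just "Leads by channel".
ax.set_title("Social drives nearly half of our new leads")
ax.set_ylabel("New leads (last 30 days)")
# Highlight the bar the headline refers to.
ax.patches[channels.index("Social")].set_color("#DD8452")
fig.tight_layout()
fig.savefig("leads_story.png")
```

The same numbers in a table would force the audience to do the comparison themselves; the headline-plus-highlight pattern does it for them.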

It’s Not an Art Project

Your business story is not your art project. Period. Yes, colors enhance engagement, but they should be pleasing to the eye, not distracting. Under no circumstances should your story distract the audience from the key takeaway. Don’t use a pie chart just because everyone else uses it, or because you personally like it. The design should help people relate to the story and find it engaging, but it should not be the only memorable thing about your story.

It Must Have a Context

If your story has plain figures, facts, and data, it may essentially not be a story. Understand your audience well and structure your story in phases for them. Help them have a context. Help them understand “why” you are saying “what” you are saying and “how” it would help them.

The odds are high that a significant section of your audience will need help understanding where you are coming from. If you have multiple visualizations to share, take a linear approach. Start with the essential background they need to grasp what’s underway.

While some of the phases of the story may be more relevant and important, it is important that you pay due attention to each without taking forever to complete it! Work on telling it in such a way that the listeners feel like they are there, living the situation.

Have a Timeline

Finally, your story must have a timeline. It should feel like it is progressing toward something. Don’t make it a haphazard collection of bland facts and figures. Tell your story with a clear beginning, middle, and end. Remove everything extra.

While studying data sets you may have a lot to share, but you must filter what’s essential and what’s not. Filter your findings and work on ways to present them in a linear fashion, progressing gradually from the beginning to the end. Stories are how audiences remember what you said.

Storytelling in Action

One of the best examples of a brand that has experienced the difference storytelling makes is Unthinkable Media. The brand worked hard for years to capture the audience’s attention. Surprisingly, when traditional methods failed, storytelling came to its rescue. The company shifted its focus from impressions to subscribers. Things changed drastically when the firm focused on creating a community and making people a part of its story! In an era where a lot of people are selling what you are selling, it’s your story that differentiates you and gives you an identity of your own.

And that unique identity is your USP; that is what sells! When you tell the “why”, you make a human connection and engage people, and that works without fail. Happy storytelling!

]]>
info@nakala-analytics.co.ke (Nakala Analytics) Blog Fri, 05 Jun 2020 00:02:25 +0300
How Customer Segmentation Using Machine Learning Can Help FinTechs Survive Post COVID-19 https://nakala-analytics.co.ke/blog-list/how-customer-segmentation-using-machine-learning-can-help-fintech-post-covid-19 https://nakala-analytics.co.ke/blog-list/how-customer-segmentation-using-machine-learning-can-help-fintech-post-covid-19

 

Over the years, Fintech has grown tremendously.

Last year, global Fintech funding rose to over $100 billion, thanks to huge, successful rounds of financing. 

Worldwide Fintech investments were expected to grow even more this year, but COVID-19 hit the world like an unwanted, unpleasant surprise and forced industries, including Fintech, into a prolonged state of uncertainty.

However, there always is light at the end of the tunnel.

Hoping that the pandemic will run its course and be fully contained by 2021, it’s high time we start visualizing a post-COVID-19 world and devising strategies to emerge from this victorious.

What will Fintech look like once the pandemic is over? How soon would it be up and running?

What’s the way out of all this and what post coronavirus changes need to be adopted to have a sound, functioning world?

Let’s approach these questions one after the other and try to understand how Fintech will fare through the coronavirus crisis.

We shall also explore an interesting possibility: how AI, and particularly machine learning and predictive analytics, can help Fintech be at its best after the pandemic.

 

How Will Fintech Fare Through the COVID-19 Crisis?

Financial technology has been revolutionary for finance.

However, the demand for Fintech services across the globe is dependent on economic activity.

With everything on hold and economic activity close to zero due to strict lockdowns and social distancing measures, payment revenues are expected to drop by 8-10%.

Spending is decreasing and so is the overall number of transactions in the Fintech space, hence a decline in the demand for services.

A significant section of Fintech relied on scaling up the customer base and earning small profit margins from money transfer and payment services. This section will suffer a blow.

In addition to this, the Fintechs that depended on international transactions including travel spending or other forms of payment would also suffer because of global restrictions on travel and trade.

There is a widespread lack of funding as well. Several budding Fintech startups will not be able to cope with this and would either fail miserably or witness a massive decline in their valuations.

The Flip Side

Seeing the other side of the coin, COVID-19 times are also the times when a lot of Fintechs, with a little smart work, can position themselves strongly.

How? With social distancing and restricted movements of people, Fintechs can assume an important responsibility.

If approached in an innovative manner, Fintechs could be the bridge between a failing economy and the recovering economy.

Across the world, governments are injecting cash into their economies to keep companies from defaulting, helping them run the minimal operations necessary or start afresh after months of lockdown.

The circulation of these large amounts of cash through various coronavirus special schemes in such a short period of time calls for a strong financial infrastructure and supply chain in place, preventing people from actually having to stand in long queues outside the banks.

Here, Fintechs can step in and facilitate these credit requests from businesses and funding to individuals who need it.

Using Customer Segmentation for an Enhanced Outcome

The Survival of the Fittest

For Fintechs, the new normal means the survival of the fittest.

Quality is the key metric for success.

It won’t be wrong to say that the future of Fintech has never been brighter, but only for those able to raise the standard and emerge as strong players, leveraging technology and modern innovation.

AI in Fintech Post Coronavirus

 


Technology, especially AI, can be of immense help here. Understanding your customers better through customer segmentation using machine learning has immense untapped potential.

Smartly deploying and accelerating the integration of big data and Artificial Intelligence (AI) powered solutions can be a game-changer.

Fintechs can employ these smart solutions to store, process, analyze, understand, and derive insights from huge sets of data about their customers' behavior and their social and browsing history.

Furthermore, customer segmentation provides insights into what trends are underway. Leverage this to answer critical business questions for the future and make smart, data, and logic-based decisions. 
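To make customer segmentation concrete, here is a minimal sketch using k-means clustering on synthetic data. The feature names and numbers are invented for illustration; real Fintech segmentation would draw on transaction history, balances, repayment behavior, and similar signals:

```python
# A minimal customer-segmentation sketch with k-means on synthetic data.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
# Synthetic customers: [monthly transactions, average transaction value]
low_activity = rng.normal([5, 20], [2, 5], size=(100, 2))
high_value   = rng.normal([10, 400], [3, 50], size=(100, 2))
high_volume  = rng.normal([60, 30], [10, 8], size=(100, 2))
customers = np.vstack([low_activity, high_value, high_volume])

# Scale features so neither dominates the distance metric.
X = StandardScaler().fit_transform(customers)

# Cluster into three segments; in practice the number of clusters is
# chosen with the elbow method or silhouette scores, not assumed.
model = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
labels = model.labels_
print(np.bincount(labels))  # customers per discovered segment
```

Each discovered segment can then be profiled (average spend, activity level) and targeted with a different product or message.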

Customized Financial Products

Empowered with accurate customer trends, data, and preferences, Fintechs would be able to invent highly customized financial products and services that traditional banking never could.

Yet another opportunity here for Fintechs is to serve those customers who can’t approach banks for credit due to lack of reliable collateral.

Targeted Marketing

The integration of AI-powered solutions makes the delivery of a targeted marketing experience possible.

An abundance of financial products and services in the market today makes it difficult to judge which product would cater to which customer segment the best.

When you are struggling to survive as a firm, you can’t take too many risks. This becomes even more relevant post coronavirus.

Talking about risk management, AI can help you examine data points from credit bureau sources and assess credit risk for consumer and small business loan applicants. That way, you can assess the loan applicant beforehand and point out defaulters, if any.

AI-empowered platforms like Underwrite collect portfolio data and use machine learning and customer segmentation to decipher patterns and classify applications as good or bad.

This can significantly help Fintechs reduce their default rates.
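The general technique can be sketched with a simple classifier on synthetic data. This is purely illustrative and is not how Underwrite or any specific platform actually works; all features, the default-generating rule, and the numbers are invented:

```python
# A hedged sketch of scoring loan applications as good (0) or bad (1)
# with logistic regression on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 1000
# Hypothetical features: credit score and debt-to-income ratio.
credit_score = rng.normal(650, 80, n)
dti = rng.uniform(0.05, 0.6, n)
X = np.column_stack([credit_score, dti])
# Synthetic rule: defaults are likelier with low scores / high DTI.
p_default = 1 / (1 + np.exp((credit_score - 600) / 40 - 4 * dti))
y = (rng.uniform(size=n) < p_default).astype(int)  # 1 = bad application

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = make_pipeline(StandardScaler(), LogisticRegression())
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
print(f"held-out accuracy: {accuracy:.2f}")
```

In production, such a score would feed a decision threshold tuned to the business's appetite for default risk versus rejected good customers.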

Furthermore, there are platforms such as Ayasdi that provide anti-money laundering detection solutions to help Fintechs understand and manage risk, and better anticipate the needs of customers.

Having accurate information about customer trends and preferences is more useful and relevant than anything else these days.

Customer segmentation helps decipher previously hidden patterns and get hands-on insights on behavior-based information.

Powerful AI solutions offer accurate results, which you can further use to target specific groups with financial products that will best resonate with their needs after the coronavirus crisis ends.

Macro-Economic Trends

Financial institutions across the globe have impressive amounts of data about macroeconomic trends.

This information can be life-changing for Fintech investors and policy-makers. However, due to the lack of necessary technology, it remains as it is with its immense untapped potential.  

Machine learning, and AI in general, can propose models and solutions to harness this data and provide the world with a roadmap to better Fintech services and products to cope with an economy dwindling under the coronavirus crisis.

An Ever Agile Industry

There is no denying the fact that the Fintech revolution emerged partially from the ashes of the previous global financial crisis.

Key players in Fintech are used to dealing with such losses and uncertainty. They are accustomed to surviving in an agile, uncertain environment, and this will be their greatest strength in this crisis.

It is probably too early to say anything definitive, but we are sure that, irrespective of the economic hardship expected to accompany the coronavirus crisis, now is the time for investors to come forward and support Fintech, whose services are expected to enhance the end consumer’s financial wellbeing in a post-COVID world.

With everyone going contactless, digital financial services are the future. These are more necessary than ever in an increasingly digitized, socially distancing world!

]]>
info@nakala-analytics.co.ke (Nakala Analytics) Blog Mon, 01 Jun 2020 15:48:43 +0300
The complexity of predicting COVID-19 fatality rates https://nakala-analytics.co.ke/blog-list/the-complexity-of-predicting-covid-19-fatality-rates https://nakala-analytics.co.ke/blog-list/the-complexity-of-predicting-covid-19-fatality-rates

Mathematical models have existed for a very long time and for many reasons. Chief among them, they have been used to predict the future. As impressive as that sounds, it is not easy to build a perfect model, as outcomes often vary widely from reality. There are usually many factors involved, which change and are controlled by many secondary variables.

Today I want to briefly explain how challenging it can be to create a model that predicts pandemic outcomes. This is not meant to discourage anyone; no single model is simply wrong. Any measure that reduces fatality rates by even 0.1% is worth trying. The number of people who die is not a statistic. Each death leaves behind a story, one that affects lives and sends emotions rolling through hundreds of thousands of families.

Getting started

I want to believe this brief can help you understand the gap between modeled numbers and the actual picture we are seeing.

To start off, let us explore some of the factors that determine the fatality rates resulting from COVID-19. Like many other predictive exercises, coming up with smart predictors early enough could guarantee quick success. However, picking the right predictors requires a deep understanding of the problem and good judgment. Ideally, one would run a model with all possible predictors and then remove those that are not significant. When there are many predictors, we need some strategy for selecting the best ones. The complexity of modeling COVID-19 lies in the unavailability of descriptive predictors.
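The "start with all predictors, then prune the insignificant ones" strategy can be sketched with recursive feature elimination. The data below is synthetic and purely illustrative, with only a few of the candidate predictors carrying real signal:

```python
# A minimal predictor-selection sketch: fit with all candidates,
# then recursively drop the least informative ones.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.feature_selection import RFE
from sklearn.linear_model import LinearRegression

# 10 candidate predictors, only 3 of which truly drive the outcome.
X, y = make_regression(n_samples=200, n_features=10, n_informative=3,
                       noise=1.0, random_state=0)

# Recursive feature elimination repeatedly refits the model and
# discards the weakest predictor until 3 remain.
selector = RFE(LinearRegression(), n_features_to_select=3).fit(X, y)
kept = [i for i, keep in enumerate(selector.support_) if keep]
print("predictors kept:", kept)
```

For COVID-19 the hard part is upstream of this step: the candidate predictors themselves are poorly measured or unavailable, which no selection strategy can fix.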

Before we look at the predictors, what is the fatality rate of COVID-19? A simple question, isn’t it? To get the fatality rate, divide the number of people who have died from the disease by the number of people infected with it. Let us explore some variables that may go into predicting fatality rates for COVID-19. You are right to say the fatality rate is a function of:

  • The daily growth in the number of people already infected
  • How many people could eventually become infected
  • The mode of spread and how quickly we can prevent the spread
  • Number of people the virus can kill
  • Availability of a cure or preventive medicine
  • Availability of funds to procure PPE and conduct mass testing
  • How quickly the authorities can identify, track, and trace possible contacts of an infected person.

This may sound straightforward to many; however, each variable mentioned above could be, or could depend on, several secondary variables. As we usually say, lack of data is data.
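The fatality-rate arithmetic itself is simple; the difficulty is the denominator. The sketch below uses hypothetical numbers, chosen only to show how strongly the rate depends on how many infections are actually counted:

```python
# Case fatality rate: deaths divided by known infections.
def fatality_rate(deaths: int, infected: int) -> float:
    """Fatality rate as a fraction of counted infections."""
    return deaths / infected

# Hypothetical country: 500 deaths among 10,000 confirmed cases.
print(f"naive fatality rate: {fatality_rate(500, 10_000):.1%}")     # 5.0%

# If testing misses mild cases and true infections number 50,000,
# the same deaths imply a much lower rate.
print(f"adjusted fatality rate: {fatality_rate(500, 50_000):.1%}")  # 1.0%
```

A fivefold undercount of infections inflates the apparent fatality rate fivefold, which is exactly the measurement problem discussed below.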

The inaccuracy of the dependent variable

The fatality rate is our dependent variable, and it is a challenge to obtain. We do not have a single confirmed fatality rate for COVID-19. Countries have their own calibrations. The fatality rate varies with age, demography, race, location, etc. As noted above, the number of infected people is a factor in the fatality rate. Unfortunately, it is not known how many people have been infected. The only sure way to determine this number would be to test all 50 million people in a country such as Kenya, which is unfeasible given the challenges experienced in testing.

To get an accurate tally, it might be instructive to consider one of the cruise ships that was quarantined after a COVID-19 outbreak, where nearly everyone on board was tested. Close confines help the virus spread, but closed environments are also an ideal place to study how the new coronavirus behaves. Unfortunately, the world isn’t a confined ship. When COVID-19 was detected among passengers on the cruise ship Diamond Princess, the vessel offered a rare opportunity to understand features of the new coronavirus that are hard to investigate in the wider population. Some of the first studies from the ship — where some 700 people were infected — have revealed how easily the virus spreads, provided estimates of the disease’s severity, and allowed researchers to investigate the share of infections with no symptoms. The results of this unusual setup suggest that there are many people walking around with COVID-19 who don’t know it — and, consequently, that the death rates are lower than other data has suggested. Inaccuracy in calculating the denominator breaks the yolk. Have you ever boiled an egg with a cracked yolk? Well, if you are hungry, it doesn’t matter.

Messiness of data

At the moment, there are rumors and blame games over cases of under-reporting or misrepresentation of the impact of COVID-19 in China. This is a typical example of how messy the data situation is globally. When data is extracted from disparate databases, the inevitable result is data inconsistencies, and nobody trusts the numbers. The lack of a centralized global process, of data management, and of adequate data strategies for combating COVID-19 has contributed widely to inaccurate data. Countries and regions collect data in different ways. There’s no single spreadsheet everyone is filling out that can easily allow us to compare cases and deaths around the world (fivethirtyeight).

The “nature” of things

Foreseeing the future of such a pandemic from data is hampered by many inconsistencies. Apart from those that are structural in nature, in some countries testing is stratified. Kenya is testing people in isolation camps and targeting areas with a high potential of community infections, while other countries are testing everyone.

The virus itself discriminates. Africa, whose population is predominantly young, has recorded high rates of infection among people between ages 10 and 60, yet in Western countries COVID-19 has mostly killed the aged.

There are many tracked and untracked factors that affect the fatality rates of a pandemic, some of which are:

  • Hospital capacity - the ability to prevent death once someone is grievously ill
  • Infection rate - This depends on the willingness of the population to wash hands, maintain social distance, and report suspected cases.
  • Rate of contact - how many people an infected person interacts with in a given time period
  • Rate of transmission per contact
  • Symptomaticity ratio
  • How long the virus can survive on a surface
  • How far it can be flung through the air
  • Duration of infectiousness

Our team in Nairobi is working hard to create a centralized database of the COVID pandemic and hopefully, in due course, we shall make it public.

 

 

 

]]>
info@nakala-analytics.co.ke (Nakala Analytics) Blog Thu, 07 May 2020 22:30:44 +0300