Advanced Analytics uses sophisticated tools for granular data analysis to enable forecasts and predictions from data.

Quick Takeaway: Advanced analytics is a very effective form of data analysis because it allows you to dig deeper into data to predict the future of your business.

An algorithm is a set of instructions we give a computer so it can take values and manipulate them into a usable form.
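For instance, here is a tiny algorithm (sketched in Python, with made-up values) that takes raw numbers and manipulates them into a usable 0–1 scale:

```python
# A minimal illustration: an algorithm that rescales a list of
# numbers into the 0-1 range (min-max normalization). The input
# values are hypothetical.
def normalize(values):
    lo, hi = min(values), max(values)
    span = hi - lo
    # Guard against division by zero when all values are equal.
    if span == 0:
        return [0.0 for _ in values]
    return [(v - lo) / span for v in values]

scaled = normalize([10, 20, 40])
print(scaled)  # [0.0, 0.3333333333333333, 1.0]
```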

Amazon Simple Storage Service (Amazon S3) is an object storage service that offers industry-leading scalability, data availability, security, and performance. This means customers of all sizes and industries can use it to store and protect any amount of data for a range of use cases, such as data lakes, websites, mobile applications, backup and restore, archive, enterprise applications, IoT devices, and big data analytics. Amazon S3 provides easy-to-use management features so you can organize your data and configure finely-tuned access controls to meet your specific business, organizational, and compliance requirements. Amazon S3 is designed for 99.999999999% (11 9’s) of durability, and stores data for millions of applications for companies all around the world.

Quick Takeaway: Amazon S3 is the world’s predominant secured storage service and fundamental to establishing a data ecosystem and culture within any organization. 

Anomaly Detection (aka Outlier Analysis) is a technique used to identify patterns in data, namely anomalies, that do not conform to expected behavior. This method has a range of real-world applications, such as intrusion detection (strange patterns in network traffic signaling a hack), health monitoring (identifying malignant tumors in MRI scans), fraud detection (suspicious credit card transactions), technical glitches (malfunctioning equipment), and changes in consumer behavior.

Quick Takeaway: Anomaly Detection helps find unusual activity in data thereby indicating an area that needs further investigation.
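As a rough sketch, anomalies can be flagged with a simple z-score rule; the transaction counts and the 2.0 threshold below are hypothetical:

```python
import statistics

def find_anomalies(values, threshold=2.0):
    """Flag points whose z-score (distance from the mean in
    standard deviations) exceeds the threshold."""
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    return [v for v in values if abs(v - mean) / stdev > threshold]

# Hypothetical daily transaction counts; 250 is the injected anomaly.
counts = [21, 23, 20, 22, 24, 19, 250]
print(find_anomalies(counts))  # [250]
```

The flagged point is not automatically an error; it marks an area that needs further investigation.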

Artificial Intelligence (AI) is the ability of a machine to perform actions that otherwise require human intelligence. For instance, tasks like visual perception, speech recognition, translation between languages, and decision-making, are supported and automated by AI using intelligent machines.

Quick Takeaway: AI enables computer programs to think and act like intelligent humans. Minus the mood swings.

Augmented Analytics uses advanced technology to independently examine data, reveal hidden patterns, provide insights, make predictions, and generate recommendations. Artificial Intelligence and Machine Learning tools are used to automate the end-to-end process, right from data preparation, insight generation, and explanation, to augmenting the output with visualization and narratives.

Quick Takeaway: Augmented Analytics has revolutionized the way people explore and analyze data on BI and analytics platforms. It is widely hailed as “The Future of Business Intelligence.”

Behavioral Analytics is a part of Business Intelligence that uses data to focus on how and why users behave the way they do, on social media platforms, eCommerce sites, while playing online games, and when using any other web application.

Quick Takeaway: Behavioral Analytics follows virtual data trails to gain insights into user behavior online.

Big Data includes a variety of structured and unstructured data, sourced from documents, emails, social media, blogs, videos, digital images, satellite imagery, and data generated by machines/sensors. It comprises large and complex datasets, which cannot be processed using traditional systems.

Quick Takeaway: Big Data is large volumes of data generated at high speeds, in multiple formats that can be of value when analyzed.

Business Intelligence (BI) is analyzing data and presenting actionable insights to stakeholders to help them make informed business decisions.

Quick Takeaway: BI enables the right use of information by the right people for the right reasons.


Cloud Computing is a broad term that refers to any internet-based application or service that is hosted remotely.

Clustering groups data points with similar properties, often drawn from multiple tables, for statistical analysis.

Quick Takeaway: Clustering is an Unsupervised Machine Learning technique.
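A toy illustration of one clustering approach (k-means on one-dimensional data); the customer ages and starting centers are made up:

```python
import statistics

def kmeans_1d(points, centers, rounds=10):
    """A toy 1-D k-means: assign each point to its nearest
    center, then move each center to its cluster's mean."""
    for _ in range(rounds):
        clusters = [[] for _ in centers]
        for p in points:
            nearest = min(range(len(centers)),
                          key=lambda i: abs(p - centers[i]))
            clusters[nearest].append(p)
        # Recompute each center; keep the old one if its cluster is empty.
        centers = [statistics.mean(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

# Hypothetical ages forming two natural groups.
centers, clusters = kmeans_1d([22, 25, 24, 60, 62, 65], centers=[0, 100])
print(sorted(round(c, 2) for c in centers))  # [23.67, 62.33]
```

No labels were supplied; the groups emerge from the data alone, which is what makes this unsupervised.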


A Dashboard is a tool used to create and deploy reports. It helps monitor and analyze key metrics on a single screen and see the correlations between them.

Quick Takeaway: A Dashboard provides an overview of the reports and metrics that matter most to you.

A Database is a digital collection of data and the structure around which the data is organized. The data is typically entered into and accessed via a database management system (DBMS).

Data-as-a-Service (DaaS) refers to data as a product that can be provided to the user on demand, regardless of geographic or organizational separation between provider and consumer.

Quick Takeaway: Quantar’s intended service offering. The provision of all-things data as a service. 

Data Blending is a fast and easy method to extract data from multiple sources and blend it into one functional dataset.

Quick Takeaway: Data Blending combines data and finds patterns without the hassle of deploying a data warehouse architecture, which is why it is preferred.

Data Cleaning is also referred to as data cleansing or scrubbing. It improves data quality through the detection and removal of inconsistencies and errors found in data.

Quick Takeaway: Data Cleaning transforms data from its original state into a standardized format to maximize the effect of data analysis.
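A minimal sketch of cleaning in Python; the field names and records are hypothetical:

```python
def clean(records):
    """Trim whitespace, normalize case, and drop incomplete rows."""
    cleaned = []
    for row in records:
        name = row.get("name", "").strip().title()
        email = row.get("email", "").strip().lower()
        if name and email:  # drop rows missing either field
            cleaned.append({"name": name, "email": email})
    return cleaned

raw = [
    {"name": "  ada lovelace ", "email": "ADA@EXAMPLE.COM"},
    {"name": "", "email": "ghost@example.com"},  # incomplete: dropped
]
print(clean(raw))
# [{'name': 'Ada Lovelace', 'email': 'ada@example.com'}]
```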

A Data Cube is the grouping of data into multidimensional hierarchies, based on a measure of interest.

Quick Takeaway: A Data Cube helps interpret a stack of data.
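A minimal sketch of the idea in Python, rolling a made-up revenue measure up along two dimensions (region and quarter):

```python
from collections import defaultdict

# Hypothetical sales rows; the "cube" aggregates a measure
# (revenue) along two dimensions (region, quarter).
rows = [
    ("north", "Q1", 100), ("north", "Q2", 150),
    ("south", "Q1", 80),  ("north", "Q1", 40),
]

cube = defaultdict(float)
for region, quarter, revenue in rows:
    cube[(region, quarter)] += revenue

print(cube[("north", "Q1")])  # 140.0
```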

Diagnostic Analysis (aka Root Cause Analysis) takes over from Descriptive Analysis to answer the question, “Why did it happen?” It drills down to find causes for the outcomes and identify patterns of behavior.

Quick Takeaway: Diagnostic Analysis provides reasoning for the outcomes of the past by breaking down the data for closer inspection.

Drill-Down Capability helps visualize data at a granular level by providing flexibility to go deep into specific details of the information required for analysis. It is an important feature of Business Intelligence because it makes reporting a lot more useful and effective.

Quick Takeaway: Drill-Down Capability offers an interactive method to display multi-level data on request without changing the underlying query.

Data Democratization enables all users to access and analyze data freely to answer questions and make decisions.

Quick Takeaway: Data Democratization is a ‘free-for-all’ access to data and its use. No holds barred.

Data engineering is all about the back end. Data engineers build the systems that make it easy for data scientists to do their analysis. In smaller teams, a data scientist may also be a data engineer. In larger groups, engineers are able to focus solely on speeding up analysis and keeping data well organized and easy to access.

Data Fabric is a unified environment of data services that provides consistent capabilities, namely data management, integration technology, and architecture design, delivered across on-premises and cloud platforms. A data fabric ensures complete automation of data access and sharing.

Quick Takeaway: Data Fabric puts the management and use of data into high gear using technology.

Data Governance is a set of processes or rules that ensure data integrity and that data management best practices are met.

A data warehouse is a system used to do quick analysis of business trends using data from many sources. They’re designed to make it easy for people to answer important statistical questions.

Data Science is the process of pulling actionable insights out of a set of data and putting them to good use. This includes everything from cleaning and organizing the data, to analyzing it to find meaningful patterns and connections, to communicating those findings in a way that helps decision-makers improve their product or organization.

Data Wrangling (aka Data Munging, Data Transformation) is the process of unifying acquired datasets with actions like joining, merging, grouping, concatenating, etc. and cleansing it for easy access and further analysis.

Quick Takeaway: Data Wrangling is the step between data acquisition and data analysis.
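For example, joining two hypothetical datasets on a shared customer_id key is a typical wrangling step between acquisition and analysis:

```python
# Two made-up datasets acquired from different sources.
customers = [{"customer_id": 1, "name": "Ada"},
             {"customer_id": 2, "name": "Grace"}]
orders = [{"customer_id": 1, "total": 30.0},
          {"customer_id": 1, "total": 12.5},
          {"customer_id": 2, "total": 99.9}]

# Build a lookup table, then join each order to its customer name.
by_id = {c["customer_id"]: c["name"] for c in customers}
joined = [{"name": by_id[o["customer_id"]], "total": o["total"]}
          for o in orders if o["customer_id"] in by_id]
print(joined[0])  # {'name': 'Ada', 'total': 30.0}
```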

A Decision Tree is a machine learning method that uses a line of branching questions or observations about a given dataset to predict a target value. Decision trees tend to over-fit as datasets grow large; random forests are a type of decision tree algorithm designed to reduce over-fitting.
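A hand-written sketch of the idea: each branch asks one question about the data. The questions, thresholds, and target value here are invented for illustration:

```python
def predict_plays_outside(temp_c, raining):
    """A two-level decision tree predicting a target value
    ("yes"/"no") from two hypothetical features."""
    if raining:           # first branching question
        return "no"
    if temp_c < 5:        # second branching question
        return "no"
    return "yes"

print(predict_plays_outside(20, raining=False))  # yes
print(predict_plays_outside(20, raining=True))   # no
```

A trained tree learns such questions and thresholds from data rather than having them written by hand.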


Exception Handling is the process of responding to unexpected events (exceptions) encountered when a predefined set of steps is executed.

Quick Takeaway: Exception Handling deals with unexpected instances that may arise when an action is performed.
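A minimal Python sketch: the division step may hit the unexpected event of a zero denominator, and the handler responds instead of letting the program crash:

```python
def safe_ratio(numerator, denominator):
    """Handle the exceptional case of a zero denominator."""
    try:
        return numerator / denominator
    except ZeroDivisionError:
        return None  # signal that no ratio could be computed

print(safe_ratio(10, 2))  # 5.0
print(safe_ratio(10, 0))  # None
```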



The General Data Protection Regulation (GDPR) is a regulation in EU law on data protection and privacy. It also addresses the transfer of personal data outside the EU.

GitHub is a web-based repository hosting service that uses Git for version control.

Quick Takeaway: GitHub is a common platform among developers and IT personnel.


Hadoop is an open-source software framework administered by Apache that allows for storage, retrieval, and analysis of very large datasets across clusters of computers.


AWS Identity and Access Management (IAM) is a web service that enables customers to manage users and user permissions within AWS.

Quick Takeaway: IAM controls who can access which resources within your AWS account.

The Internet of Things (IoT) is the network of physical objects or “things” embedded with electronics, software, sensors, and connectivity that enables them to achieve greater value and service by exchanging data with the manufacturer, operator, and/or other connected devices. Each thing is uniquely identifiable through its embedded computing system but is able to interoperate within the existing Internet infrastructure.



Key Performance Indicator (KPI) (aka Key Metric) is an important indicator that helps measure a department or organization’s performance and health.

Quick Takeaway: KPIs indicate how a business is performing based on certain parameters.



Machine Learning (ML) is an application of AI that enables computer applications to learn from large datasets without specific programming and to improve when exposed to new data. ML is used to automate the building of analytical models.

Quick Takeaway: ML is the ability of machines to self-learn based on data provided and accurately identify instances of the learned data.
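As a toy sketch of learning from data, a one-nearest-neighbor classifier labels a new observation by the closest example it has seen before; the study-hours data below is made up:

```python
def nearest_neighbor(train, query):
    """1-nearest-neighbor: 'learn' purely from stored examples
    and label a new point by the closest one seen before."""
    closest = min(train, key=lambda pair: abs(pair[0] - query))
    return closest[1]

# Hypothetical labeled data: (hours studied, outcome).
train = [(1, "fail"), (2, "fail"), (8, "pass"), (9, "pass")]
print(nearest_neighbor(train, 7))  # pass
```

No rule was programmed in; adding new labeled examples changes future predictions, which is the "learning" part.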

Metadata provides information about other data within a database. It provides references to data, which makes finding and working with collected data for the end-user easier in some cases.

Quick Takeaway: Metadata is data about data. For instance, username, date created/modified, file size are basic document metadata.
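A quick sketch of reading a file's metadata in Python, as opposed to its contents:

```python
import os
import tempfile
import time

# Create a throwaway file so the example is self-contained.
with tempfile.NamedTemporaryFile(delete=False, suffix=".txt") as f:
    f.write(b"hello")
    path = f.name

# os.stat returns metadata: data about the file, not the file itself.
info = os.stat(path)
print(info.st_size)               # 5 (size in bytes)
print(time.ctime(info.st_mtime))  # last-modified timestamp
os.remove(path)
```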



An outlier is a data point that is considered extremely far from other points. They are generally the result of exceptional cases or errors in measurement, and should always be investigated early in a data analysis workflow.


Predictive Analysis uses summarized data to answer the question, “What is likely to happen?” It uses past performance to make logical predictions of future outcomes, applying statistical modeling to forecast estimates. The accuracy of these estimates depends largely on the quality and detail of the data used. It is widely used across industries to provide forecasts and risk-assessment inputs in functions such as Sales, Marketing, HR, Supply Chain, and Operations.

Quick Takeaway: Predictive Analysis attempts to predict the future using advanced technology and skilled resources to analyze data, not a crystal ball.
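As a bare-bones sketch of statistical forecasting, a least-squares trend line can be fitted to made-up monthly sales and extrapolated one month ahead:

```python
# Hypothetical sales for months 1-4; we forecast month 5.
months = [1, 2, 3, 4]
sales = [100, 120, 140, 160]

# Ordinary least squares for a line y = slope * x + intercept.
n = len(months)
mean_x = sum(months) / n
mean_y = sum(sales) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(months, sales))
         / sum((x - mean_x) ** 2 for x in months))
intercept = mean_y - slope * mean_x

forecast = slope * 5 + intercept
print(forecast)  # 180.0
```

Real predictive models are far richer, but the principle is the same: fit past behavior, then extrapolate.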

Prescriptive Analysis is the last level of data analysis wherein insights from other analyses (Descriptive, Diagnostic, Predictive) are combined and used to determine the course of action to be taken in a situation. Needless to say, the technology used is a lot more advanced and so are the data practices.

Quick Takeaway: Prescriptive Analysis prescribes data-driven next steps for decision making.


Amazon QuickSight is a fast, cloud-powered business intelligence service that makes it easy to deliver insights to everyone in your organization.

As a fully managed service, QuickSight lets you easily create and publish interactive dashboards that include ML Insights. Dashboards can then be accessed from any device, and embedded into your applications, portals, and websites.

Quick Takeaway: QuickSight is a fast, cloud-powered business analytics service that makes it easy to build visualizations, perform analysis, and quickly get business insights from your data.

Quantitative Analysis is a field highly focused on using algorithms to gain an edge in the financial sector. These algorithms either recommend or make trading decisions based on huge amounts of data. Quantitative analysts are often called “quants.”


Real-Time Predictions are predictions generated synchronously for individual data observations.

Quick Takeaway: Effective for real-time decision making at all levels of partner organizations.

A Recipe is a set of preformatted instructions for common data transformations that fine-tune machine learning model performance.

Regression is a type of machine learning model that predicts a numeric value.

Root is the parent container for the accounts in your organization (in AWS Organizations).

Quick Takeaway: Useful because it provides a consolidated view of the organization via a master account.


Sentiment Analysis is the application of statistical functions to comments people make on the web and through social networks to determine how they feel about a product or company.

Slice and Dice refers to the division of data into smaller uniform sections that present the information in diverse and useful ways; a pivot table in a spreadsheet is one example.

Quick Takeaway: Slice and Dice is the breakdown of data into smaller parts to reveal more information.

Snapshot refers to the state of a dataset at a given point in time.

Quick Takeaway: A snapshot provides an instant copy of data, captured at a certain time.
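A minimal Python sketch: a deep copy freezes the dataset's state at one moment while the live data keeps changing (the dataset itself is hypothetical):

```python
import copy

dataset = {"users": ["ada", "grace"], "version": 1}
snapshot = copy.deepcopy(dataset)  # state frozen at this moment

dataset["users"].append("alan")    # live data keeps changing
dataset["version"] = 2

print(snapshot)  # {'users': ['ada', 'grace'], 'version': 1}
```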

Software as a Service (SaaS) is a delivery model for software, centrally hosted by the vendor and licensed to customers on a pay-for-use or subscription basis.

Quick Takeaway: SaaS packages and sells software as a service.

SQL (Structured Query Language) is a programming language for retrieving data from a relational database.
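A short example using SQLite, the relational database bundled with Python; the table and rows are hypothetical:

```python
import sqlite3

# An in-memory database keeps the example self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("Ada", 30.0), ("Grace", 99.9), ("Ada", 12.5)])

# A SQL query retrieving total spend per customer.
rows = conn.execute(
    "SELECT customer, SUM(total) FROM orders "
    "GROUP BY customer ORDER BY customer").fetchall()
print(rows)  # [('Ada', 42.5), ('Grace', 99.9)]
conn.close()
```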


Transactional Data is data that relates to the conducting of business, such as accounts payable and receivable data or product shipment data.


A User is a person or application under an account. Each user has a unique name within the respective account and a set of security credentials not shared with other users. Each user is associated with one and only one account.

Quick Takeaway: Users are easy to define and identify, and they add an additional layer of security within the organization.

Unstructured Data is data that has no identifiable structure – for example, the text of email messages.


Validation is the process of ensuring that data used in analytics is correct and precise.

Visualization is a visual abstraction of data designed for the purpose of deriving meaning or communicating information more effectively. Visuals created are usually complex, but understandable enough to convey the message of the data.


Web scraping is the process of pulling data from a website’s source code. It generally involves writing a script that will identify the information a user wants and pull it into a new file for later analysis.
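A minimal sketch using Python's standard-library HTML parser; the page body below stands in for a fetched response, and the file names are invented:

```python
from html.parser import HTMLParser

class LinkScraper(HTMLParser):
    """Pull every href out of a page's source code."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(v for k, v in attrs if k == "href")

# A hypothetical page body stands in for a fetched response.
page = ('<ul><li><a href="/a.csv">A</a></li>'
        '<li><a href="/b.csv">B</a></li></ul>')
scraper = LinkScraper()
scraper.feed(page)
print(scraper.links)  # ['/a.csv', '/b.csv']
```

A real script would fetch the page over HTTP first and write the extracted data to a new file for later analysis.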



