NexaIntelligence is our social-media discovery and monitoring platform for those who need to extract actionable insights from discussions to inform decision-making.
Current languages supported: English, French, Russian, and Korean (more coming soon).
The system collects and analyses data from Twitter, Facebook, Tumblr, blogs, web forums, online news sites, Google Alerts and RSS feeds. With it, you’ll be able to build qualitative analyses on both quantitative and qualitative data, so you can provide context for the numbers rather than just spreadsheets.
When exploring Twitter data, users immediately have access to:
Our newest tool will include novel entity extraction and text summarisation functions. This will allow users to input large amounts of text or social media data into the system and receive:
Download our Hurricane Irma case study to see how the tool works.
The future is bright, and we have big plans for it!
We’re not just building tools to make your life easier today; we’re also thinking about the needs you’ll have tomorrow. Our current tools are the basis for a future cutting-edge AI product we refer to as NexaAgent.
As its name suggests, this AI will be your agent for all questions requiring social data mining and analysis. By integrating the functionalities of NexaIntelligence and Augmented NexaIntelligence, NexaAgent will be able to answer any of your questions, such as:
“NexaAgent – what are people talking about on Twitter in Venezuela?”
Plus, it will be fully responsive and optimized for mobile and tablets, as well as wearables, meaning you will have an incredible source of valuable information at your fingertips! By using our current products, you’re contributing to the creation of a life-changing product.
Our technology enables law enforcement agencies (LEAs), security enterprises and intelligence operations to combat cyber crime, identify cyber criminals and their relationships, and detect online security threats.
Natural Language Processing (NLP) draws on methods for working with human language from linguistics, computer science, statistics and mathematics. It enables the processing of massive amounts of textual data without human intervention. NLP can detect the author’s sentiment, the most important issues in a conversation, and other features. It can be applied to simple tasks (such as matching the same name written in various formats, or separating spam from non-spam) as well as to very complicated ones (such as detecting emotions and human intention). Natural Language Processing is anything but simple keyword counting. Our patent-pending Natural Language Processing methodology extracts topics, concepts, entities and key ideas from any human-generated speech, post, call, or other digital data.
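As a minimal sketch of the spam/non-spam task mentioned above (not our patent-pending methodology), a toy classifier can go beyond keyword counting by weighting terms with TF-IDF and learning from labelled examples; the training messages and labels below are invented for illustration:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Tiny invented training set: two spam and two legitimate ("ham") messages.
train_texts = [
    "win a free prize now click here",
    "limited offer claim your free money",
    "meeting moved to 3pm see agenda",
    "please review the attached report",
]
labels = ["spam", "spam", "ham", "ham"]

# TF-IDF turns each message into weighted term frequencies;
# Naive Bayes then learns which terms indicate each class.
model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(train_texts, labels)

print(model.predict(["claim your free prize"])[0])   # "spam"
print(model.predict(["see the meeting report"])[0])  # "ham"
```

The same pipeline shape (vectoriser plus classifier) scales to sentiment or topic detection once trained on a realistic corpus.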
Social Network Analysis is an interdisciplinary approach that reveals the connections between members of an analysed community using sociology, statistics and graph theory. It shows which communities the people in the network form, who the most important actors in a given community are, and what relationships and importance each actor has based on the network topology. Visualising social networks exposes unnoticed ties and reveals patterns in the data that other methods miss. Among other sources, networks can be built from surveys, observations, or social media data. Social Network Analysis depicts the relationships between users of a given platform, be it Twitter, Facebook, Instagram etc.: who talks to whom, who is marginalised in a conversation, who are the “bridges” connecting sparse communities, and how information flows.
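For instance, finding the “bridge” users who connect otherwise sparse communities can be done with betweenness centrality from graph theory. The sketch below uses the networkx library on an invented toy network of two tight groups joined by a single account:

```python
import networkx as nx

# Invented toy network: two dense communities joined by one "bridge" user.
G = nx.Graph()
G.add_edges_from([
    ("a", "b"), ("a", "c"), ("b", "c"),        # community 1
    ("d", "e"), ("d", "f"), ("e", "f"),        # community 2
    ("c", "bridge"), ("bridge", "d"),          # the connecting account
])

# Betweenness centrality counts how many shortest paths pass
# through each node; the bridge lies on every cross-community path.
centrality = nx.betweenness_centrality(G)
most_central = max(centrality, key=centrality.get)
print(most_central)  # "bridge"
```

On real social media data, the graph edges would come from replies, mentions or follows rather than a hand-built list.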
Machine Learning is an area of computer science that makes computers learn without much human intervention. The goal of Machine Learning algorithms is to create a general model from data in order to interpret other data never seen before. Among other things, the outcome of Machine Learning could be a prediction about the future, detecting sentiment or topics in tweets, or distinguishing the brand Apple from the fruit apple – without human intervention. Deep Learning is a recently emerged subfield of Machine Learning, inspired by the structure and function of the human brain’s neural networks. Deep Learning aims to unify Machine Learning with Artificial Intelligence by building more complex artificial neural networks. Its several processing layers enable the machine to analyse complex structures.
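The Apple-versus-apple example above can be sketched as a small supervised model that generalises from labelled context words to sentences it has never seen; the training sentences are invented for illustration and this is not our production approach:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented examples: the word "apple" appears in every sentence,
# so the model must learn from the surrounding context instead.
train = [
    "apple released a new iphone today",
    "apple stock rose after the keynote",
    "apple shares and the new macbook",
    "i ate a juicy apple for lunch",
    "apple pie recipe with cinnamon",
    "picked a ripe apple from the tree",
]
labels = ["brand", "brand", "brand", "fruit", "fruit", "fruit"]

clf = make_pipeline(CountVectorizer(), LogisticRegression())
clf.fit(train, labels)

# Unseen sentences: the model generalises from context words.
print(clf.predict(["apple unveiled a new macbook"])[0])  # "brand"
print(clf.predict(["apple pie with cinnamon"])[0])       # "fruit"
```

A Deep Learning version would replace the count-based features with learned word embeddings and stacked neural-network layers, but the train-then-generalise structure is the same.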
A good Data Visualisation is as telling as an essay on the very same topic, and how to interpret it must be clear to non-technical readers. Graphics let humans explore data far more easily than browsing a spreadsheet of thousands and thousands of rows. Nevertheless, clarifying data through visualisation does not necessarily mean simplifying the visuals. As Edward R. Tufte, the celebrated authority on information design, put it: “…the task of the designer is to give visual access to the subtle and the difficult – that is, the revelation of the complex.” Aware of the principles of human visual perception, we use intuitive data visualisations that enable smooth interpretation of the most important aspects of the dataset analysed.
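One such perception principle – removing non-essential ink so the data stands out – can be sketched with matplotlib on invented sample mention counts:

```python
import matplotlib
matplotlib.use("Agg")  # render without a display
import matplotlib.pyplot as plt

# Invented sample data: daily brand mentions over one work week.
mentions = {"Mon": 120, "Tue": 340, "Wed": 90, "Thu": 210, "Fri": 400}

fig, ax = plt.subplots(figsize=(6, 3))
ax.bar(list(mentions.keys()), list(mentions.values()), color="steelblue")
ax.set_title("Daily brand mentions (sample data)")
ax.set_ylabel("Mentions")

# Strip non-data ink (chart borders) so attention goes to the bars.
ax.spines["top"].set_visible(False)
ax.spines["right"].set_visible(False)

fig.tight_layout()
fig.savefig("mentions.png")
```

The chart answers "which day spiked?" at a glance, where the equivalent spreadsheet row would require reading every cell.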
© 2018 - All Rights with Datametrex AI Limited