Wallscope’s platform is a versatile data processing engine that uses AI to link and contextualise data, making it easier to search, navigate and report on large amounts of information.
Our core platform has been tested and delivered across a range of market sectors, with clients and partners including the Scottish Government, Public Health Scotland, the University of Edinburgh and Amsterdam University of Applied Sciences. It incorporates a suite of tools which can be used to build bespoke solutions, workflows and interfaces that meet diverse business needs.
Wallscope continuously develops, trains and tests machine learning models, and we deploy the latest of these as functional processors ready for industry-level use.
Our expertise is primarily focused on Natural Language Processing (NLP). Utilising state-of-the-art NLP models, we can extract contextual information from any electronic document. By combining these models with an in-depth understanding of the key identifiers needed for analysis, we can provide clear prototypes to effectively explore and evaluate the data provided.
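To illustrate the idea of pulling entity mentions out of free text, here is a deliberately simplified, pure-Python sketch. It uses a capitalisation heuristic as a stand-in for a trained NLP model; a production pipeline would use a proper NER model that also assigns entity types and handles context. The function name and regex are illustrative, not part of Wallscope's platform.

```python
import re

def extract_entities(text: str) -> list[str]:
    """Toy entity extractor: find runs of two or more capitalised words,
    e.g. "Public Health Scotland". A stand-in for a trained NER model."""
    return re.findall(r"\b[A-Z][a-z]+(?:\s+[A-Z][a-z]+)+\b", text)

doc = "Wallscope works with Public Health Scotland and the Scottish Government."
print(extract_entities(doc))
# → ['Public Health Scotland', 'Scottish Government']
```

A heuristic like this misses single-word and lowercase entities, which is exactly why trained statistical models are used in practice.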
Once extracted, we export information to a knowledge graph. Storing the data in this way enables representation of both explicit and inferred relationships between entities. It also supports statistical analysis and modelling at macro and micro scales. Our platform’s processors incorporate clustering and aggregation algorithms which can be customised for specific use cases.
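The distinction between explicit and inferred relationships can be sketched with a minimal triple store in plain Python. The facts and predicate names below are illustrative examples, not Wallscope's actual schema; the inference shown is a simple transitive closure over one predicate.

```python
# Facts as (subject, predicate, object) triples -- the shape a knowledge
# graph uses. These example facts are illustrative only.
explicit = {
    ("Edinburgh", "locatedIn", "Scotland"),
    ("Scotland", "locatedIn", "UnitedKingdom"),
}

def infer_transitive(triples, predicate):
    """Derive inferred facts by transitively closing one predicate:
    if (a, p, b) and (b, p, c) hold, then (a, p, c) is inferred."""
    closed = set(triples)
    changed = True
    while changed:
        changed = False
        for s, p, o in list(closed):
            for s2, p2, o2 in list(closed):
                if p == p2 == predicate and o == s2 and (s, p, o2) not in closed:
                    closed.add((s, p, o2))
                    changed = True
    return closed - triples  # only the newly inferred facts

print(infer_transitive(explicit, "locatedIn"))
# → {('Edinburgh', 'locatedIn', 'UnitedKingdom')}
```

Real knowledge graph stores express this kind of rule declaratively (e.g. via RDFS/OWL reasoning) rather than with hand-written loops, but the principle is the same: stored facts entail further relationships that queries can then surface.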
Insights from the knowledge graph can be visualised and shared in customised dashboards, graphs and charts, and overlaid with maps or timelines for added context.
All of Wallscope’s services can be deployed and scaled within a secure and robust infrastructure, either on the customer’s existing platforms or through a cloud-based service.
Using Linked Data from DBpedia allows us to enrich and add value to our applications, helping users to discover contextual relationships and uncover new insights.
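As a concrete sketch of this kind of enrichment, the snippet below builds a SPARQL query asking DBpedia's public endpoint which entities link to a given resource, and assembles the request URL. The helper function and its parameters are illustrative assumptions; a production integration would also handle paging, errors and result parsing.

```python
from urllib.parse import urlencode

# DBpedia's public SPARQL endpoint.
DBPEDIA_ENDPOINT = "https://dbpedia.org/sparql"

def related_entities_query(resource: str, limit: int = 10) -> str:
    """Build a SPARQL query (illustrative helper) listing entities and
    predicates that point at the given DBpedia resource."""
    return (
        "SELECT DISTINCT ?related ?relation WHERE { "
        f"?related ?relation <http://dbpedia.org/resource/{resource}> . "
        f"}} LIMIT {limit}"
    )

query = related_entities_query("Edinburgh")
request_url = DBPEDIA_ENDPOINT + "?" + urlencode(
    {"query": query, "format": "application/sparql-results+json"}
)
print(request_url)
```

Fetching that URL returns JSON-formatted results whose bindings can be merged into the local knowledge graph, linking internal entities to the wider Linked Data web.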