Application Model and Engine

Composable Analytics uses a data-flow-based programming methodology. An application within the environment consists of modules and the connections between them. Modules take inputs, perform an execution step, and produce outputs. Modules are chained together through connections, forming directed graphs called dataflows. The model has no concept of variables, so all state must flow in and out of the modules. This produces a system with no “side effects”. So why is this so important? With this approach, the platform can systematically parallelize the entire application by analyzing the dependencies between the modules.
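The idea can be sketched in a few lines. This is a minimal, hypothetical model (the `Module` class and `run` engine are invented for illustration, not the platform's actual API): modules are pure functions, connections are declared dependencies, and the engine runs every module whose inputs are ready in parallel.

```python
# Minimal sketch of a dataflow engine; names are illustrative only.
from concurrent.futures import ThreadPoolExecutor

class Module:
    def __init__(self, name, func, inputs=()):
        self.name = name
        self.func = func            # pure step: outputs depend only on inputs
        self.inputs = list(inputs)  # upstream modules this one depends on

def run(modules):
    """Execute an acyclic dataflow graph, running independent modules in parallel."""
    results = {}
    remaining = list(modules)
    with ThreadPoolExecutor() as pool:
        while remaining:
            # Any module whose inputs are all computed has no pending dependency,
            # so this whole batch can execute concurrently.
            ready = [m for m in remaining
                     if all(i.name in results for i in m.inputs)]
            futures = {m: pool.submit(m.func, *[results[i.name] for i in m.inputs])
                       for m in ready}
            for m, f in futures.items():
                results[m.name] = f.result()
            remaining = [m for m in remaining if m not in ready]
    return results

# Two independent queries run in parallel; the join waits for both.
a = Module("query_a", lambda: [1, 2, 3])
b = Module("query_b", lambda: [4, 5, 6])
join = Module("join", lambda x, y: x + y, inputs=(a, b))
print(run([a, b, join])["join"])  # [1, 2, 3, 4, 5, 6]
```

Because no module writes shared state, the engine never needs the author to reason about locks; the dependency structure alone determines what can run concurrently.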

Access Control and Security

Users can specify fine-grained access control on their methods and results within Composable Analytics, granting specific users and groups read, write, execute, clone, and discover permissions. Applications also let users share capabilities when raw, line-level data is too sensitive: other users can execute the application and access only higher-level aggregated views.
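A sketch of what such fine-grained grants might look like, assuming a hypothetical `AccessControlList` structure (the class and method names are invented for illustration; the five permission types come from the text above):

```python
# Hypothetical per-application ACL sketch; not the platform's actual API.
PERMISSIONS = {"read", "write", "execute", "clone", "discover"}

class AccessControlList:
    def __init__(self):
        self.grants = {}  # principal (user or group) -> set of permissions

    def grant(self, principal, *perms):
        assert set(perms) <= PERMISSIONS, "unknown permission"
        self.grants.setdefault(principal, set()).update(perms)

    def allows(self, principal, perm):
        return perm in self.grants.get(principal, set())

acl = AccessControlList()
# A group may execute and see aggregated results, but not read raw data.
acl.grant("analysts", "execute", "discover")
acl.grant("alice", "read", "write", "execute", "clone", "discover")
print(acl.allows("analysts", "execute"))  # True
print(acl.allows("analysts", "read"))     # False
```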

Analytics are Operational Capabilities

Ad-hoc analytics can go from an exploration phase immediately into an operational capability that can be executed and consumed using our robust, fault-tolerant platform. This reduces the burden on developers because they don’t have to rewrite a business analyst’s prototype. There’s no need to build out web services, a business-object layer, and a custom user interface. Composable Analytics provides an alternative approach, allowing users to author web services and create dashboards to display the information quickly and easily.

Dataflows Span the Entire Stack

While most systems have separate tools for extraction, transformation, loading, querying, visualization, and dissemination, Composable Analytics lets you do all of that using dataflow applications. Dataflow applications let users chain processing steps together (e.g., querying for data, aggregating, or visualizing), yielding order-of-magnitude improvements in development efficiency and runtime performance. Users can develop apps that mash multiple web services together, move data into a data warehouse, or query and visualize information. It can all be done within Composable Analytics.
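The chaining described above can be illustrated with three stand-in steps (the functions here are invented placeholders; real modules would wrap services, warehouses, and charting components):

```python
# Sketch of chaining query -> aggregate -> visualize in one pipeline.
from collections import defaultdict

def query(_):
    # Stand-in for a data-source module (e.g., a web service or SQL query).
    return [("east", 10), ("west", 7), ("east", 5)]

def aggregate(rows):
    # Transformation step chained directly onto the query output.
    totals = defaultdict(int)
    for region, amount in rows:
        totals[region] += amount
    return dict(totals)

def visualize(totals):
    # Stand-in for a visualization module: a tiny text bar chart.
    return "\n".join(f"{k}: {'#' * v}" for k, v in sorted(totals.items()))

# Extraction, transformation, and presentation as one chained dataflow.
print(visualize(aggregate(query(None))))
```

Each step consumes only the previous step's output, which is exactly what lets the same application cover the whole stack from ingestion to display.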

No Monolithic Data Model or Warehouse

Composable Analytics does not require users to model all their data before consumption. Nor does it require users to develop adapters that siphon data into a single warehouse where it can be consumed (causing latency and duplication of data). Instead, dataflow application authors can query and cherry-pick the information they specifically need to answer the question at hand. The ‘answer’ can take the form of plots, maps, tables, or data that can be consumed by another analytic.
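As a small illustration of cherry-picking at query time (the records and field names here are invented): rather than staging a full copy of the source in a warehouse, the dataflow selects only the columns the question needs.

```python
# Sketch: select just the needed fields from a source at query time,
# with no staging warehouse. All data and field names are illustrative.
records = [
    {"id": 1, "region": "east", "amount": 10},
    {"id": 2, "region": "west", "amount": 7},
]

def cherry_pick(rows, fields):
    # Keep only the columns needed to answer the question at hand.
    return [{f: r[f] for f in fields} for r in rows]

answer = cherry_pick(records, ["region", "amount"])
print(answer)  # [{'region': 'east', 'amount': 10}, {'region': 'west', 'amount': 7}]
```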

Other Business Languages?

Composable Analytics provides a significant advantage to application authors compared to other business process languages such as BPEL (Business Process Execution Language) and EMML (Enterprise Mashup Markup Language), which use variables to share information between building blocks. To speed up parts of their analytics, authors of BPEL workflows or EMML mashups must explicitly use parallel sequence or parallel ‘foreach’ constructs and must structure the workflow to ensure no concurrent writes are made to shared variables.
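The contrast can be sketched in generic Python rather than BPEL or EMML themselves (the step functions are invented): with shared variables, the author must add synchronization before parallelizing; in a dataflow style, each step returns its output instead of writing shared state, so no author-managed locking is needed.

```python
# Illustrative contrast only; not actual BPEL/EMML or platform code.
from concurrent.futures import ThreadPoolExecutor
from threading import Lock

# Shared-variable style: concurrent writes to `totals` would race,
# so the author must manage a lock explicitly.
totals = []
lock = Lock()

def shared_variable_step(x):
    with lock:              # author-managed synchronization
        totals.append(x * x)

# Dataflow style: the step is pure, so the engine can run many copies
# concurrently with no locks and collect the outputs.
def dataflow_step(x):
    return x * x

with ThreadPoolExecutor() as pool:
    list(pool.map(shared_variable_step, range(4)))
    results = list(pool.map(dataflow_step, range(4)))

print(sorted(totals))  # [0, 1, 4, 9]
print(results)         # [0, 1, 4, 9]
```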

While BPEL is good at long-running message exchanges and human interactions, Composable Analytics focuses on bringing together disparate datasets, disseminating results, and producing data feeds and services. Based on our studies, analysts have an easier time comprehending what an analytic is doing and how data is flowing through the application using Composable's flow-based methodology. Developers also enjoy the self-service nature of the platform: they can focus on writing code in the “first class” modules and let the analysts perform the linking and configuration of the application.

While executing the same code across 1,000 nodes is exciting, we recognize that analytics these days are more complex. Agile, just-in-time analytics involves multiple stages of processing. Each step might be querying, filtering, aggregating, reducing, scattering, etc., and these steps can be parallelized at the application's global level, or within each step. Parallelization is no longer an afterthought, but simply built in from the start.
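Those two levels of parallelism can be sketched together (stage names and data are invented for illustration): independent branches of the application run concurrently, and a single stage can also fan out over partitions of its input.

```python
# Sketch of parallelism at two levels; all names are illustrative.
from concurrent.futures import ThreadPoolExecutor

def filter_stage(rows):
    # One processing step: keep only even values.
    return [r for r in rows if r % 2 == 0]

def aggregate_stage(rows):
    # Within-step parallelism: partition the data and reduce in parallel.
    chunks = [rows[i::2] for i in range(2)]
    with ThreadPoolExecutor() as pool:
        partials = list(pool.map(sum, chunks))
    return sum(partials)

data_a, data_b = list(range(10)), list(range(10, 20))
with ThreadPoolExecutor() as pool:
    # Application-level parallelism: two independent branches at once.
    fa = pool.submit(filter_stage, data_a)
    fb = pool.submit(filter_stage, data_b)
    total = aggregate_stage(fa.result() + fb.result())
print(total)  # (0+2+4+6+8) + (10+12+14+16+18) = 90
```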