📣 Announcement

Introducing Unnug

Compile your data pipelines with Unnug

Unnug is a cloud compiler for your serverless and analytics workloads.

Coming soon...

Hero Video

Problem

Building data pipelines is super complicated.

Infrastructure Overhead

Data engineers are losing up to 44% of their time on infrastructure tasks when building and maintaining data pipelines. These tasks are often repetitive, tedious, and don't contribute directly to core business value.

Difficult to Test / Reproduce

Traditional pipelines often have tightly coupled components, making it challenging to test individual parts in isolation. Pipelines may rely on specific infrastructure setups, making it hard to create consistent test environments.

Hard to Change / Port

Modifying existing data pipelines is often complex and time-consuming, and they're hard to port across platforms or to version control effectively.

Solution

Build Pipelines, Not Infrastructure

Focus on Your Data's Journey, We'll Handle the Rest

Simplify data pipeline infrastructure

Unnug provides a comprehensive toolset that abstracts away the complexities of data pipeline infrastructure.

https://unnug.com

Enable declarative pipeline development

Unnug introduces a declarative approach to pipeline development, similar to how Terraform revolutionized infrastructure-as-code. Users define their entire pipeline, including data sources, transformations, and destinations, in a clear, concise configuration file.
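As an illustration, a declarative pipeline definition in this style might look like the sketch below. The file name, keys, and values here are hypothetical assumptions for the sake of example, not Unnug's actual configuration schema:

```yaml
# pipeline.yaml — hypothetical example; not Unnug's actual schema
source:
  type: postgres          # where raw data comes from
  table: orders
transform:
  entrypoint: transforms/clean_orders.py   # user-defined logic
destination:
  type: s3                # where processed data lands
  bucket: analytics-output
```

A single file like this would capture sources, transformations, and destinations in one place, much as a Terraform file captures infrastructure.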

https://unnug.com

Provide cloud-agnostic deployment

Unnug's platform allows users to build and package their data pipelines locally using familiar tools and their preferred programming languages. These packaged pipelines can then be deployed effortlessly to various cloud platforms or Unnug's own cloud infrastructure.

https://unnug.com

Build and deploy using CLI tool

Unnug offers a streamlined command-line tool that simplifies the entire pipeline lifecycle. Users can download this tool from Unnug's website and use it to build, package, and deploy their data pipelines with simple commands right from their terminal.
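A build-and-deploy session from the terminal might then look like the following. The command names and flags are illustrative assumptions, not Unnug's documented CLI:

```shell
# Hypothetical commands for illustration only
unnug build -f pipeline.yaml     # package the pipeline locally
unnug deploy --target aws        # deploy to a chosen cloud
```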

https://unnug.com

How it works

Just 3 steps to get started

    1. Define pipeline configuration

    This configuration acts as the blueprint for the entire pipeline, specifying key aspects such as Data Sources, Data Destinations and Data Transformations.

    2. Define your transformations

    This step defines how data should be transformed from its raw form into a desired, structured output. User-defined transformations give developers full control over how data is manipulated, enabling custom logic tailored to specific business needs or technical requirements.

    3. Build and deploy using CLI tool

    Unnug provides a command-line interface (CLI) tool that streamlines the build and deployment process. Users download this tool from Unnug's website. With a simple command, the tool packages the configuration and all necessary components. The same tool can then deploy the packaged pipeline to the user's chosen cloud platform or Unnug's cloud.
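To make step 2 concrete, a user-defined transformation is typically just a function over records. The sketch below is a generic Python example of that kind of logic; the function name and record shape are assumptions for illustration, not Unnug's actual transformation API:

```python
# Hypothetical user-defined transformation: clean raw order records.
# Illustrates the custom logic referred to in step 2; not Unnug's API.

def clean_orders(records):
    """Drop records missing an id and normalize amounts to floats."""
    cleaned = []
    for rec in records:
        if rec.get("order_id") is None:
            continue  # skip malformed rows
        cleaned.append({
            "order_id": rec["order_id"],
            "amount": float(rec.get("amount", 0)),
        })
    return cleaned

raw = [
    {"order_id": 1, "amount": "19.99"},
    {"order_id": None, "amount": "5.00"},  # dropped: missing id
    {"order_id": 2},                       # amount defaults to 0.0
]
print(clean_orders(raw))
```

Because a transformation like this is a plain function with no infrastructure dependencies, it can be unit-tested in isolation, which addresses the testability problem described above.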

Features

User Flows and Navigational Structures

    Select your data sources

    Choose the origin of your data, such as databases, streaming services, cloud storage, or file systems. This step defines where your raw data is coming from.

    Select your data destination

    Specify where the processed data should go, like data warehouses, storage services, or external applications. This ensures the transformed data reaches the desired endpoint.

    Define your compute function

    Write custom data transformation logic using your preferred language to clean, aggregate, or manipulate data to match your business needs.

    Select your compute processing framework

    Choose a processing framework like Apache Flink or Apache Spark, which will handle the execution of your transformations in a scalable and distributed manner.

FAQ

Frequently asked questions

Still have questions? Email us at info@unnug.com

Ready to get started?

Start your trial today.