Data Delivery Engineer

BaseCap Analytics
USA

Headquarters: New York 
URL: https://basecapanalytics.com/

Data is at the heart of cost efficiency, business processes, and decisioning. Digitally savvy firms have an edge in benefiting from new technologies.

BaseCap is a rapidly growing software company that helps organizations improve their data quality twice as efficiently as existing solutions. We created a software platform that frees teams from cleaning and preparing data and allows them to focus on creating accurate, actionable insights for their company.

As a Data Delivery Engineer, you will be responsible for executing solutions that meet client needs. You will be the primary technical contact at client sites, responsible for completing technical proofs of concept, building and maintaining mission-critical data pipelines, and sizing future development projects.

Responsibilities

  • Present audience-appropriate explanations of data engineering practices to technical teams
  • Work hands-on to create proofs of concept that demonstrate ideas which may be unfamiliar to customers or prospects
  • Build environments for storing raw data, utilizing Azure Data Lake Storage, SQL Server, Blob Storage, and other technologies (a minimal example of landing raw data follows this list)
  • Implement data warehouses and analysis services as part of a modern data warehouse (MDW) design
  • Utilize tools such as Azure Data Factory, Azure Functions, and Azure Stream Analytics to perform data orchestration and transformation
  • Live and breathe the corporate values that fuel our success and make us who we are
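
As a concrete illustration of the raw-storage work mentioned above, the sketch below lands a single extract file in Azure Blob Storage. It is a minimal example rather than BaseCap's actual pipeline: the connection-string variable, container name, and file paths are placeholders, and it assumes the azure-storage-blob Python SDK; an Azure Data Lake Storage Gen2 account can be written to in much the same way.

```python
# Minimal sketch: landing a raw extract in a "raw" zone of Blob Storage.
# The connection string, container name, and paths are placeholders,
# not actual BaseCap or client values.
import os

from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"]
)
container = service.get_container_client("raw")

# Upload a local CSV extract into a date-partitioned path.
with open("customers_2024-01-15.csv", "rb") as data:
    container.upload_blob(
        name="sales/customers/2024/01/15/customers.csv",
        data=data,
        overwrite=True,
    )
```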

Qualifications

  • Expert-level experience with SQL, including the ability to understand, edit, or maintain existing stored procedures
  • Experience building and maintaining pipelines in Azure Data Factory (ADF); this is a hard requirement
  • Experience building and maintaining environments for storing raw data, including Azure Data Lake, SQL Server, Blob Storage, and other technologies
  • Experience with other Microsoft Azure services, including
    • Azure Data Lake Storage
    • CosmosDB
    • Azure Databricks
  • Experience with other NoSQL storage, document databases, and graph databases is a plus
  • Experience with Spark SQL or Hive SQL is a plus (see the sketch after this list)
  • Experience working with customers to elicit business requirements and demonstrate technical concepts in an understandable fashion
  • Willingness to learn in a fast-paced environment
  • Ability to embrace feedback and hold yourself accountable
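
For the Spark SQL qualification above, the sketch below shows the flavor of the work, for example inside Azure Databricks: registering a raw extract as a temporary view and shaping it with a SQL query. It assumes a PySpark environment, and the paths, table name, and columns are hypothetical placeholders.

```python
# Minimal Spark SQL sketch (e.g. in Azure Databricks). Paths, table name,
# and columns are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("raw-to-curated").getOrCreate()

# Register a raw CSV extract as a temporary view for SQL access.
raw = spark.read.option("header", "true").csv("/mnt/raw/sales/customers/")
raw.createOrReplaceTempView("raw_customers")

# A simple transformation: keep the latest record per active customer.
curated = spark.sql("""
    SELECT customer_id, MAX(updated_at) AS updated_at
    FROM raw_customers
    WHERE status = 'active'
    GROUP BY customer_id
""")

curated.write.mode("overwrite").parquet("/mnt/curated/customers/")
```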

This role can be performed remotely from anywhere in the US.
