BI AND BIG DATA

Senior Data Engineer - London - Full-Time, Permanent, Hybrid - Base Salary - GBP £70,000 to £90,000

Senior Data Engineer

London

Full-Time, Permanent, Hybrid

Base Salary - GBP £70,000 to £90,000

 

 

BOUNTY DESCRIPTION

Skimlinks, a Connexity and Taboola company, drives e-commerce success for 50% of the Internet’s largest online retailers. We deliver $2B in annual sales by connecting retailers to shoppers on the most desirable retail content channels. As a pioneer in online advertising and campaign technology, Connexity is constantly iterating on products, solving problems for retailers, and building interest in new solutions.

We have recently been acquired by Taboola to create the first open-web source for publishers, connecting editorial content to product recommendations so readers can easily buy products related to the stories they are reading.

Skimlinks, a Taboola company, is a global e-commerce monetization platform, with offices in LA, London, Germany, and NYC. We work with over 60,000 premium publishers and 50,000 retailers around the world helping content producers get paid commissions for the products and brands they write about.

 

About the role

We are looking for a Senior Data Engineer to join our team in London. We are creating a fundamentally new approach to digital marketing, combining big data with large-scale machine learning. Our data sets are on a truly massive scale - we collect data on over a billion users per month and analyse the content of hundreds of millions of documents a day.

As a member of our Data Platform team your responsibilities will include:

  • Design, build, test and maintain high-volume Python data pipelines.

  • Analyse complex datasets in SQL.

  • Communicate effectively with Product Managers and Commercial teams to translate complex business requirements into scalable solutions.

  • Apply software development best practices.

  • Work independently in an agile environment.

  • Share your knowledge across the business and mentor colleagues in areas of deep technical expertise.

 

Requirements:

Here at Skimlinks we value dedication, enthusiasm, and a love of innovation. We are disrupting the online monetization industry, and welcome candidates who want to be a part of this ambitious journey. But it is not just hard work; we definitely appreciate a bit of quirkiness and fun along the way.

  • A Bachelor's or Master's degree in computer science or a related field.

  • Solid programming skills in both Python and SQL.

  • Proven work experience in Google Cloud Platform or another cloud, developing scalable batch (Apache Airflow) and streaming (Dataflow) data pipelines (a minimal sketch follows this list).

  • Passion for processing large datasets at scale (BigQuery, Apache Druid, Elasticsearch).

  • Familiarity with Terraform, DBT & Looker is a plus.

  • Experience driving initiatives around performance optimisation and cost reduction.

  • A commercial mindset and a passion for creating outstanding products.
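
For illustration, here is a minimal sketch of the kind of batch pipeline described above: an Apache Airflow DAG on GCP that loads a day's worth of events from Cloud Storage into BigQuery and then runs a SQL aggregation. The DAG, bucket, dataset, and table names are placeholders invented for this sketch, not details from the listing.

# Minimal Airflow (2.4+) DAG sketch: daily batch load from GCS into BigQuery,
# followed by a SQL aggregation step. Bucket/dataset/table names are
# placeholders and would differ in a real pipeline.
import pendulum
from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_click_events",                      # hypothetical pipeline name
    schedule="@daily",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    catchup=False,
) as dag:
    load_events = GCSToBigQueryOperator(
        task_id="load_events",
        bucket="example-events-bucket",               # placeholder bucket
        source_objects=["clicks/{{ ds }}/*.json"],    # one folder of files per day
        destination_project_dataset_table="analytics.clicks_raw",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_TRUNCATE",
    )

    aggregate = BigQueryInsertJobOperator(
        task_id="aggregate_daily_revenue",
        configuration={
            "query": {
                "query": """
                    SELECT publisher_id, DATE('{{ ds }}') AS day,
                           SUM(commission) AS revenue
                    FROM analytics.clicks_raw
                    GROUP BY publisher_id
                """,
                "useLegacySql": False,
            }
        },
    )

    load_events >> aggregate

A streaming counterpart to this batch sketch would typically be a Dataflow (Apache Beam) job rather than a scheduled DAG.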

Voted one of the “Best Places to Work,” we have a culture driven by self-starters, team players, and visionaries. Headquartered in Los Angeles, California, the company operates sites and business services in the US, UK, and EU. We offer top benefits including annual leave entitlement, paid holidays, competitive compensation, team events and more!

  • Healthcare insurance & cash plans

  • Pension

  • Parental Leave Policies

  • Learning & Development Program (educational tool)

  • Flexible work schedules

  • Wellness Resources

  • Equity

We are committed to providing a culture at Connexity that supports the diversity, equity and inclusion of our most valuable asset, our people. We encourage individuality, and are driven to represent a workplace that celebrates our differences, and provides opportunities equally across gender, race, religion, sexual orientation, and all other demographics. Our actions across Education, Recruitment, Retention, and Volunteering reflect our core company values and remind us that we’re all in this together to drive positive change in our industry.

SKILLS AND CERTIFICATIONS (required)

  • Airflow

  • Python

  • SQL

  • GCP

  • BigQuery

  • Data pipelines

To Apply, Please Complete the Form Below

Data Engineer - USA, Connecticut - $90,000 to $120,000

Data Engineer

USA, Connecticut

$90,000 to $120,000

 

Position Summary:

Reporting to the Associate Director, Information Technology, the Data Manager/Data Analyst is responsible for overseeing the development and use of company data systems and for ensuring that all information flowing to and from the company is handled in a timely and secure manner. They will also effectively identify, analyze, and translate business needs into technology and process solutions. This position can be based out of Westlake Village, CA; Marlborough, MA; or Danbury, CT.

 

Principal Responsibilities:

  • Work with other team members and business stakeholders to drive development of business analytics requirements.

  • Leverage knowledge of business processes and the data domain.

  • Bring deep expertise in data visualization tools and techniques when translating business analytics needs into data visualization and semantic data access requirements.

  • Work with various business units to facilitate the technical design of complex data sourcing, transformation, and aggregation logic, ensuring business analytics requirements are met.

  • Leverage enterprise-standard tools and platforms to visualize analytics insights, typically working with and/or leading a small team.

  • Regularly monitor and evaluate information and data systems that could affect analytical results.

  • Translate business needs into technical specifications.

  • Design, build, and deploy BI solutions (e.g., reporting tools).

  • Manage integration tools and the data warehouse.

  • Manage and conduct data validation and troubleshooting.

  • Create visualizations and reports according to business requirements.

  • Monitor and enhance databases and related systems to optimize performance.

  • Proactively address scalability and performance issues.

  • Ensure data quality and integrity while supporting large data sets.

  • Debug and resolve issues affecting database reliability, integrity, and efficiency.

  • As a member of the IT organization at MannKind Corp., the incumbent is also expected to be customer-focused, a problem solver, a communicator, professional, willing to learn, organized, and a team player.

  • Duties and responsibilities are not limited to the work listed above and may include other assignments as necessary.

 

Education and Experience Qualifications:

  • BS/BA degree with a minimum of 3-5 years of related experience in data management or analysis.

  • 3+ years of experience with Relational Database Management Systems (RDBMS)

  • 3+ years of Business Intelligence / Analytics related work experience in challenging environments

  • Strong understanding of modern data modelling techniques

  • Strong understanding of Cloud services providers (AWS, Google, Microsoft) and how to architect solutions around them

  • Ability to decipher and organize large amounts of data.

  • An analytical mindset with superb communication and problem-solving skills.

  • Ability to translate complex problems clearly and in nontechnical terms.

  • In-depth SQL programming knowledge: partitioning, indexing, performance tuning, stored procedures, and views (see the sketch after this list).

  • Hands-on experience in developing dashboards and data visualizations using BI tools (e.g., Power BI, Tableau)
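
As a rough illustration of the SQL depth listed above, the sketch below creates a partitioned fact table, a supporting index, and a stored procedure from Python. The DDL is PostgreSQL-flavored, and every table, column, and connection detail is invented for the example; the listing does not specify a particular RDBMS or schema.

# Sketch of the SQL constructs named in the qualifications: a partitioned
# table, a supporting index, and a stored procedure. PostgreSQL-flavored DDL;
# table/column names and connection details are placeholders.
import psycopg2  # any DB-API driver would do

DDL = [
    # Range-partition a large fact table by month so queries filtering on
    # order_date only scan the relevant partition.
    """
    CREATE TABLE sales_fact (
        order_id   BIGINT,
        order_date DATE NOT NULL,
        amount     NUMERIC(12, 2)
    ) PARTITION BY RANGE (order_date);
    """,
    """
    CREATE TABLE sales_fact_2024_01 PARTITION OF sales_fact
        FOR VALUES FROM ('2024-01-01') TO ('2024-02-01');
    """,
    # Index the column most reports filter on.
    "CREATE INDEX idx_sales_fact_order_date ON sales_fact (order_date);",
    # Target table for a recurring aggregation.
    "CREATE TABLE monthly_summary (month DATE, total_amount NUMERIC(14, 2));",
    # A stored procedure that encapsulates the transformation.
    """
    CREATE OR REPLACE PROCEDURE refresh_monthly_summary()
    LANGUAGE SQL
    AS $$
        INSERT INTO monthly_summary (month, total_amount)
        SELECT date_trunc('month', order_date), SUM(amount)
        FROM sales_fact
        GROUP BY 1;
    $$;
    """,
]

with psycopg2.connect("dbname=analytics user=analyst") as conn:  # placeholder DSN
    with conn.cursor() as cur:
        for statement in DDL:
            cur.execute(statement)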

Database Analyst - USA, Wilson - $63,800 - $90,000

Database Analyst

USA, Wilson

$63,800 - $90,000

 

Job Description

Oversees multiple database server environments to ensure they remain efficient, secure, and accurate. Gathers, organizes, and interprets data to provide reports and business intelligence. Develops and maintains database documentation. Maintains current knowledge of data storage and management best practices.

 

Requirements:

·        High school diploma or equivalent - Required

·        Bachelor’s degree - Preferred 

·        Minimum of 2 years’ experience in a database role.

·        Extensive knowledge of Microsoft SQL Server.

·        Excellent SQL development skills.

·        Excellent communication skills.

·        Understanding of BI tools and processes.

Senior Data Engineer - USA, Remote - $110,560 to $155,840

Senior Data Engineer

USA, Remote

$110,560 to $155,840

 

Job Description

You are a driven and motivated problem solver ready to pursue meaningful work. You strive to make an impact every day, not only at work but in your personal life and community too. If that sounds like you, then you've landed in the right place.

The Data Science AI Factory team is committed to exploring new ways to use data and analytics to solve business problems. The team utilizes a variety of data sources, with a strong focus on unstructured and semi-structured text, using NLP to enhance outcomes related to claims, underwriting, operations, and the customer experience.

As a Sr. Data Engineer, you will be an established thought leader, partnering closely with expert resources to design, develop, and implement data assets for a wide range of new initiatives across multiple lines of business. The role involves heavy data exploration, proficiency with SQL and Python, knowledge of service-based deployments and APIs, and the ability to discover and learn quickly through collaboration. You will need to think analytically and outside of the box while questioning current processes and continuing to build your business acumen.

There will be a combination of team collaboration and independent work efforts. We seek candidates with a strong quantitative background and excellent analytical and problem-solving skills. This position combines business and technical skills, involving interaction with business customers, data science partners, internal and external data suppliers, and information technology partners.

Responsibilities

  • Identify and validate internal and external data sources for availability and quality. Work with SMEs to describe and understand data lineage and suitability for a use case.

  • Create data assets and build data pipelines that align to modern software development principles for further analytical consumption. Perform data analysis to ensure quality of data assets.

  • Create summary statistics/reports from data warehouses, marts, and operational data stores.

  • Extract data from source systems and data warehouses, and deliver it in a pre-defined format using standard database query and parsing tools.

  • Understand ways to link or compare information already in our systems with new information.

  • Perform preliminary exploratory analysis to evaluate nulls, duplicates, and other issues with data sources (see the sketch after this list).

  • Work with data scientists and knowledge engineers to understand the requirements and propose and identify data sources and alternatives.

  • Produce code artifacts and documentation using GitHub for reproducible results and hand-off to other data science teams.

  • Propose ways to improve and standardize processes to enable new data and capability assessment and to enable pivoting to new projects.

  • Understand data classification and adhere to the information protection and privacy restrictions on data.

  • Collaborate closely with data scientists, business partners, data suppliers, and IT resources.
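
As a small illustration of the preliminary data-quality checks mentioned above, here is a pandas sketch that profiles a source extract for nulls and duplicate keys. The file name and key column are placeholders, not details from this listing.

# Minimal data-quality profiling sketch with pandas: null counts, duplicate
# keys, and basic summary statistics for a source extract.
# "claims_extract.csv" and "claim_id" are placeholder names.
import pandas as pd

def profile(path: str, key: str) -> dict:
    df = pd.read_csv(path)
    return {
        "rows": len(df),
        "null_counts": df.isna().sum().to_dict(),          # nulls per column
        "duplicate_keys": int(df.duplicated(subset=[key]).sum()),
        "numeric_summary": df.describe().to_dict(),         # min/max/mean etc.
    }

if __name__ == "__main__":
    report = profile("claims_extract.csv", key="claim_id")
    for metric, value in report.items():
        print(metric, value)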

Experience & Skills

Candidates must have the technical skills to transform, manipulate, and store data; the analytical skills to relate the data to the business processes that generate it; and the communication skills to document and disseminate information about the availability, quality, and other characteristics of the data to a diverse audience. These varied skills may be demonstrated through the following:

  • Bachelor’s degree or equivalent experience in a related quantitative field

  • 5+ years of experience accessing and retrieving data from disparate large data sources by creating and tuning SQL queries. Understanding of data modeling concepts, data warehousing tools, and databases (e.g. Oracle, AWS, Snowflake, Spark/PySpark, ETL, Big Data, and Hive)

  • Demonstrated ability to create and deliver high-quality Python code using software engineering best practices. Experience with object-oriented programming and software development is a plus. Proficiency with GitHub and Linux is highly desired.

  • Ability to analyze data sources and provide technical solutions. Strong exploratory and problem-solving skills to check for data quality issues.

  • Ability to determine business recommendations and translate them into actionable steps

  • Self-starter with curiosity and a willingness to become a data expert

  • A demonstrated passion both for learning new skills and for leading discovery in data research

  • Results-oriented, with the ability to multi-task and adjust priorities when necessary

  • Ability to work both independently and in a team environment with internal customers 

  • Ability to articulate and train technical concepts regarding data to both data scientists and partners

Lead MS Dynamics Technical Consultant - Bulgaria, Sofia - Лв.60,000 - Лв.110,000

Lead MS Dynamics Technical Consultant

Bulgaria, Sofia

Лв.60,000 - Лв.110,000

 

Job Description

We're looking for a proactive and creative Lead Technical Consultant for Microsoft Dynamics 365 to integrate into our growing teams working across a diverse mix of clients.

The Dynamics 365 Technical Consultant role is a unique hybrid of business and technical consultancy, with the ability to operate as a technical lead focusing on Dynamics 365. As the lead technical consultant for multiple client projects, this person is responsible for the successful onboarding and overall technological direction of those projects and works as a liaison between the client, project management, and operations teams to design and deliver high-quality, scalable technical solutions.

The successful candidate will be able to display thought leadership, technical expertise, and the ability to manage cross-division teams from project inception to completion.

 

As part of your daily work, you will:

·        Form part of a team of highly skilled consultants - there is not much we don’t know about marketing technology implementation!

·        Demonstrate technical expertise while scoping and driving the creation of marketing and data solutions, ETL, platform configuration, APIs, integrations, and other platform developments.

·        Lead discovery workshops, gather business, marketing, and technical requirements to craft a comprehensive and scalable solution.

·        Create architecture diagrams and flow charts on both system architecture and detailed solution levels while documenting the technical solution design of the Dynamics 365 platform.

·        Own delivery tasks from initial scoping, planning and requirements analysis, through to discovery, design, development, testing, and deployment.

·        Architect data feeds, platform configuration, and components needed to run complex automated email campaigns.

·        Identify and resolve problems including client data feed issues.

·        Act as a consultant and technical expert on Dynamics 365 to serve clients’ needs.

·        The team is experienced in multiple software technologies, and you will have the option to cross-train and enhance your knowledge – you could expand your product portfolio with a range of other tools (Adobe Experience Cloud, Salesforce Marketing Cloud, Braze, mParticle, Bloomreach)

Azure Data Engineer - Irving, TX - Full-Time, Permanent - $110,000 - $120,000

Azure Data Engineer
Irving, TX
Full-Time, Permanent
$110,000 - $120,000

Required Skills:


  • Experience in GCP/Azure

  • Strong data modelling

  • Python

  • Experience with RDBMS

  • Big Data processing frameworks and tools (Cloudera, Sqoop, Hive, Impala, Spark)

  • DevOps tools and techniques (e.g., continuous integration, Jenkins, Puppet, etc.)


Preferred Skills:

  • Experience building/migrating data pipelines from on-prem to the cloud (GCP or any cloud)

  • Understanding of cloud technologies

  • Unix Scripting

  • Tableau and Excel tool expertise


Job Description:

  • Build data pipelines to ingest data from on-prem systems to the cloud

  • Experience with Big Data processing frameworks and tools (Cloudera, Sqoop, Hive, Impala, Spark)

  • Experience with DevOps tools and techniques (e.g., continuous integration, Jenkins, Puppet, etc.)

  • Experience with software development on a team using Agile methodology

  • Build data standardization and transformation logic using a framework that follows object-oriented programming concepts

  • Write Unit Test scripts

  • Implement standardized error handling and diagnostic logging (see the sketch after this list)

  • Schedule and maintain production workflows on-prem as well as in the cloud

  • Troubleshoot and resolve QA and Production defects

  • Handle code review and code deployment
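
By way of illustration, the sketch below shows one way to standardize error handling and diagnostic logging around a pipeline step, together with a unit test. The step, names, and framework choices are assumptions for the example; the listing does not prescribe them.

# Sketch: a pipeline step wrapped with standardized error handling and
# diagnostic logging, plus a unit test. All names are illustrative only.
import logging

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("pipeline")

class StepError(Exception):
    """Raised when a pipeline step fails, after diagnostics are logged."""

def run_step(name, func, *args, **kwargs):
    """Run one pipeline step with uniform logging and error wrapping."""
    log.info("step %s started", name)
    try:
        result = func(*args, **kwargs)
    except Exception as exc:
        log.exception("step %s failed", name)          # full traceback for diagnostics
        raise StepError(f"{name} failed") from exc
    log.info("step %s finished", name)
    return result

def standardize_amounts(rows):
    """Example transformation: cast the 'amount' field to float."""
    return [{**r, "amount": float(r["amount"])} for r in rows]

# Unit test (runnable with pytest or a unittest-style runner).
def test_standardize_amounts():
    rows = [{"id": 1, "amount": "10.5"}]
    assert run_step("standardize", standardize_amounts, rows)[0]["amount"] == 10.5

Wrapping every stage in a helper like run_step keeps log messages and failure behaviour consistent across on-prem and cloud schedulers.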

Data Engineer - USA, Remote - $98,880.00 - $148,320.00

Data Engineer

USA, Remote

$98,880.00 - $148,320.00

Job Description

The Data Engineer is responsible for developing Business Intelligence solutions using Oracle, Power BI, and other technologies as necessary. This developer will work to productize and document solutions so that professional services personnel can deliver them to customers efficiently. This person must be strong in data architecture, data warehousing, and data transformation.

•   Analyse business requirements, functional requirements, and data specifications to develop Business Intelligence solutions that leverage Oracle, Power BI, other technologies as necessary, and data architecture standards.

•   Develop dashboards utilizing Power BI connecting to an Oracle database.

•   Develop Oracle PL/SQL procedures to perform data transformation (see the sketch at the end of this listing).

•   Develop technical specifications and documentation.

•   Work directly with business users and technical staff to develop, document, and test Business Intelligence solutions.

•   Troubleshoot BI tools, systems, and software, including the timely resolution of production issues.

•   Other responsibilities as assigned.

Requirements:

•   Bachelor’s Degree in computer science, Business Administration, or a related degree, along with 4 years of data warehouse/business intelligence experience.

•   A minimum of 3-5 years’ hands-on experience working with and developing Business Intelligence solutions that leverage Oracle.

•   Experience in data warehouse design and performance tuning required.

•   Experience with TypeScript/JavaScript for Power BI custom tools (preferred).

•   Experience with Python (preferred).

•   Strong PL/SQL and SQL skills.

•   Power BI skills a plus.

•   Demonstrated understanding of database architecture design patterns and the complete application development lifecycle.

•   Must possess excellent verbal and written communication skills, proven business acumen, exceptional interpersonal capabilities, and attention to detail.
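
As a rough sketch of the PL/SQL-driven transformation work described above, the snippet below uses the python-oracledb driver to call a transformation procedure and confirm that the reporting table a Power BI dashboard would read from has been populated. The procedure, table, and connection details are hypothetical, not taken from the listing.

# Sketch: invoke a (hypothetical) Oracle PL/SQL transformation procedure from
# Python and verify the target table was populated. Connection details,
# procedure, and table names are placeholders.
import oracledb

conn = oracledb.connect(user="bi_etl", password="***", dsn="dbhost/ORCLPDB1")  # placeholders
try:
    with conn.cursor() as cur:
        # Assumed PL/SQL procedure that aggregates raw orders into a reporting
        # table consumed by a Power BI dashboard.
        cur.callproc("load_sales_summary")
        cur.execute("SELECT COUNT(*) FROM sales_summary")
        rows_loaded, = cur.fetchone()
        print(f"sales_summary now holds {rows_loaded} rows")
    conn.commit()
finally:
    conn.close()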