  • London
  • £50,000 - £75,000 per annum
  • Permanent
  • 10 Mar 2020

Data Engineer

London

£50,000 - £75,000

Fantastic opportunity for a Data Engineer to become an integral part of the client's newly formed Data team. Reporting to the Head of Data, you will be responsible for shaping the technologies and the long-term success of the Data function of the business.

This role will suit an ambitious Data Engineer who would, down the line, like to lead a team of Data specialists and select the best technologies for the client to use.

What skills do you need?

  • AWS/ Azure / GCP
  • Python, Java or Scala
  • ETL
  • Hadoop and Spark

As mentioned, these are still early days for the client's entrance into the Big Data and analytics space, so there is no rigid wish list of skills. They are looking for a candidate with a proven track record of adapting to new technologies who is able to discuss ideas with senior stakeholders.

The client offers excellent career opportunities, with the chance to pursue ongoing skills development and certifications.

Don't hesitate to contact me to hear more regarding this role:

  • London
  • £60,000 - £75,000 per annum
  • Permanent
  • 10 Mar 2020

Data Architect
London
£60,000 - £75,000

This is a great opportunity for a Data Architect to join a greenfield project and a team who operate right across a highly successful, market-leading platform. This role offers a unique opportunity to learn and develop your skills whilst moving the business forward with future data initiatives and leading the development of the Data Warehouse.

THE COMPANY:

The company are a market-leading auction platform transforming the market for buying and selling goods. They are listed in the Top 100 most innovative companies in Europe and operate a highly profitable, unique product.

THE ROLE - DATA ARCHITECT:

  • Build and maintain the global data warehouse infrastructure.
  • Full responsibility/awareness of and integration with the data warehouse development process.
  • Scrutiny of new dimensions/metrics/processes prior to release.
  • Manage a team of offshore developers.

YOUR SKILLS AND EXPERIENCE:

  • A solid foundation in data warehousing: SSIS, Data Modelling methodologies, etc.
  • Experience with Data Engineering, SQL Server Reporting and Analysis Services
  • Experience with data warehousing (preferably Azure OR AWS OR GCP)
  • Excellent communication skills
  • Excellent ETL experience

HOW TO APPLY:

To apply, please do so via this site. For more information, reach out to Liam Wilson.

KEYWORDS:

SQL Server, SSIS, Data Warehousing, Kimball, Hadoop, Spark, Big Data, Data Developer, Financial Services, GCP, Azure, AWS

  • City Of London
  • £60,000 - £80,000 per annum
  • Permanent
  • 10 Mar 2020

I am engaged with a leading organisation in the Insurance space who are looking to recruit experienced Data Engineers into their growing 'Cloud & Data' function, based out of their HQ in London (a five-minute walk from the Gherkin).

The organisation are going through a huge recruitment drive in 2020, focusing on improving the way they utilise data, and you'll be pleased to know that, unlike a lot of non-tech companies, they view technology as an enabler rather than a blocker.

The roles in question are focused primarily on the design and development of a strategic analytical platform (AWS) to enable integration, analysis and insight across a variety of data sets, as well as the development of data and analytics solutions for machine learning and data visualisation. Moreover, you will be expected to develop and manage relationships across data, technology and business functions (think stakeholder management), challenging existing paradigms in a constructive manner that demonstrates and promotes the value of a mature data analytics capability.

Fundamentally, you will be given the opportunity to drive a data-first mentality within the organisation, a unique and rewarding challenge that is rare to find in established organisations such as this one.

In terms of their expectations… if you can demonstrate some of the below skills then you stand a good chance of being shortlisted to the interview stage:

  • Excellent understanding of Data Architecture, including both on-premise and via the AWS Cloud.
  • Strong experience with Data Modelling and Data Analysis, including the ability to query, prepare and analyse Data sets.
  • Proven commercial experience with both SQL and NoSQL databases.
  • Skilled in software engineering, with experience in some (not necessarily all) of the following programming languages: Python, R, Java, Scala.
  • Familiarity with Linux/UNIX and tools including Git and SVN.
  • Good understanding of Data Science models and Data Modelling in general (e.g. Star Schema, Kimball); a brief illustrative sketch follows this list.
  • Some commercial experience with technologies in the Big Data space (e.g. Cassandra, Spark, Kafka, Hadoop, Kibana etc.).
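
For context only (not part of the advert): the star-schema modelling mentioned above can be illustrated with a minimal, hypothetical example. In Kimball-style designs, a fact table of measurable events carries foreign keys to descriptive dimension tables, and reporting queries join and aggregate across them. The sketch below uses Python and pandas with invented table and column names.

    import pandas as pd

    # Hypothetical dimension tables: descriptive attributes only.
    dim_customer = pd.DataFrame({
        "customer_key": [1, 2],
        "customer_name": ["Acme Ltd", "Globex plc"],
        "region": ["London", "Manchester"],
    })
    dim_date = pd.DataFrame({
        "date_key": [20200310, 20200311],
        "calendar_date": ["2020-03-10", "2020-03-11"],
    })

    # Hypothetical fact table: measures plus foreign keys to the dimensions.
    fact_sales = pd.DataFrame({
        "customer_key": [1, 2, 1],
        "date_key": [20200310, 20200310, 20200311],
        "sales_amount": [120.0, 75.5, 210.0],
    })

    # A typical star-schema query: join facts to dimensions, then aggregate.
    report = (
        fact_sales
        .merge(dim_customer, on="customer_key")
        .merge(dim_date, on="date_key")
        .groupby(["region", "calendar_date"], as_index=False)["sales_amount"]
        .sum()
    )
    print(report)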

In return, you will get the opportunity to work on cutting-edge projects in an Agile environment.

Moreover, I have personally worked with these guys for years and they have an excellent company culture, focusing on tailored personal development, genuine career progression and flexibility, whether it be home working opportunities or simply flexible working hours. They also offer a generous bonus/benefits package as well as the chance to get huge discounts on car, travel and home insurance.

So if you like the sound of the role and would like to find out more then please contact me via the following:

  • Mobile:
  • Office Line:
  • Email:
  • LinkedIn: 'Liam Haghighat'

Anything discussed will remain completely confidential and fully compliant with GDPR.

  • Peterborough
  • £60,000 - £75,000 per annum, inc benefits
  • Permanent
  • 10 Mar 2020

Global Manufacturer require a Data Engineer to work with developers, database architects and data scientists on strategic data initiatives. You will be responsible for expanding and optimizing data and pipeline architecture, as well as data flow and collection for cross-functional teams. You will be an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up.

Client Details

Global Manufacturer / Distributor

Description

Global Manufacturer are looking for a Data Engineer to work with software developers, database architects, data analysts and data scientists on strategic data initiatives. The Data Engineer will be responsible for expanding and optimizing data and data pipeline architecture, as well as optimizing data flow and collection for cross functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up.

As Data Engineer you will be responsible for:

  • Creating and maintaining optimal data pipeline architecture.
  • Assembling large, complex data sets that meet functional and non-functional business requirements.
  • Identifying, designing, and implementing internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Building the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL/NoSQL and Azure 'Big Data' technologies (see the illustrative sketch after this list).
  • Building analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
  • Working with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
  • Keeping our data separated and secure across national boundaries through multiple data centres and Azure regions.
  • Creating data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
  • Working with data and analytics experts to strive for greater functionality in our data systems.
  • Developing, constructing, maintaining and testing data architectures, such as databases and large-scale processing systems.
  • Cleaning, aggregating, and organizing data from disparate sources and transferring it to data storage for analysis; ensuring the company's data technology is operating at its peak results in significant improvements to performance, cost, or both.
  • Owning the stability of newly designed products, including their ongoing robustness and resilience.
  • Building and maintaining custom ingestion pipelines.
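
As a hedged illustration of the ETL bullet above (not part of the job specification): the sketch below shows a minimal PySpark extract-transform-load job. The storage paths and column names are invented, and the client's actual Azure tooling (Data Factory, Databricks, Synapse, etc.) is not stated in the advert.

    from pyspark.sql import SparkSession, functions as F

    # Minimal illustrative ETL job. Paths and column names are placeholders;
    # Azure storage credentials and connector configuration are omitted.
    spark = SparkSession.builder.appName("orders-etl").getOrCreate()

    # Extract: read raw CSV files landed in cloud storage.
    raw = (
        spark.read.option("header", True)
             .csv("abfss://raw@examplelake.dfs.core.windows.net/orders/")
    )

    # Transform: basic cleansing and type casting.
    cleaned = (
        raw.dropDuplicates(["order_id"])
           .filter(F.col("order_total").isNotNull())
           .withColumn("order_total", F.col("order_total").cast("double"))
           .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
    )

    # Load: write the curated data set out in a columnar format for analytics.
    cleaned.write.mode("overwrite").parquet(
        "abfss://curated@examplelake.dfs.core.windows.net/orders/"
    )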

Experienced using the following software/tools:

  • Big Data tools: Hadoop, Spark, Kafka, etc.
  • Relational SQL and NoSQL databases, including Postgres and Cassandra
  • Data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc
  • AWS cloud services: EC2, EMR, RDS, Redshift
  • Stream-processing systems: Storm, Spark-Streaming (a brief streaming sketch follows this list)
  • Object-oriented/object function scripting languages: Python, Java, C++, Scala
  • Data Integration / Archiving / Relational Databases and Data Warehousing
  • MS SQL, Logic Apps, Function Apps, SSIS, SSRS
  • Microsoft Azure DevOps
  • Data Visualisation
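
Similarly, as a hedged sketch of the stream-processing item above: a minimal Spark Structured Streaming job that reads from a hypothetical Kafka topic and counts events per one-minute window. The broker address and topic name are placeholders, and the job assumes the spark-sql-kafka connector package is available on the cluster.

    from pyspark.sql import SparkSession, functions as F

    # Illustrative only: placeholder broker and topic names.
    spark = SparkSession.builder.appName("events-stream").getOrCreate()

    # Read a stream of records from Kafka; key/value arrive as binary.
    events = (
        spark.readStream.format("kafka")
             .option("kafka.bootstrap.servers", "broker:9092")  # placeholder
             .option("subscribe", "events")                     # placeholder
             .load()
    )

    # Cast the payload to a string and count messages per 1-minute window.
    counts = (
        events.selectExpr("CAST(value AS STRING) AS value", "timestamp")
              .groupBy(F.window("timestamp", "1 minute"))
              .count()
    )

    # Print the running counts to the console until stopped.
    query = counts.writeStream.outputMode("complete").format("console").start()
    query.awaitTermination()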

Profile

  • Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field.
  • Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
  • Experience building and optimizing 'Big Data' data pipelines, architectures and data sets.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Strong analytic skills related to working with unstructured datasets.
  • Build processes supporting data transformation, data structures, metadata, dependency and workload management.
  • A successful history of manipulating, processing and extracting value from large disconnected datasets.
  • Working knowledge of message queuing, stream processing, and highly scalable 'Big Data' data stores.
  • Strong project management and organizational skills.

Experienced using the following software/tools:

  • Big Data tools: Hadoop, Spark, Kafka
  • Relational SQL and NoSQL databases, including Postgres and Cassandra
  • Data pipeline and workflow management tools: Azkaban, Luigi, Airflow
  • AWS cloud services: EC2, EMR, RDS, Redshift
  • Stream-processing systems: Storm, Spark-Streaming
  • Object-oriented/object function scripting languages: Python, Java, C++, Scala
  • Data Integration / Archiving / Relational Databases and Data Warehousing
  • MS SQL, Logic Apps, Function Apps, SSIS, SSRS
  • Microsoft Azure DevOps
  • Data Visualisation

Job Offer

Opportunity to join a rapidly expanding team

Opportunity to work in an advanced analytics function

  • London
  • £80,000 - £90,000 per annum
  • Permanent
  • 10 Mar 2020

Engineering Manager

£80,000 - £90,000 + Benefits

London



The Company

Join a leading travel technology company who are using Big Data at the forefront of their operations, with endless amounts of data waiting to be utilised.

The Role

As an Engineering Manager, you will be working in an agile environment to ensure products are delivered in line with business needs.

Responsibilities include:

  • Mentor and manage a team of 7 engineers
  • Contribute to the long-term vision and technical roadmap
  • Apply the appropriate standards and principles when delivering data products
  • Ensure code is optimised and suggest configuration changes to improve performance

Skills and Requirements:

To qualify for this Engineering Manager role, you will need:

  • Commercial experience managing or leading a team
  • Prior experience working with Java, DevOps (Docker, Kubernetes), Amazon Web Services (AWS)
  • Strong understanding of Engineering architecture

HOW TO APPLY:

Please register your interest by sending your CV to Sean via the Apply link on this page.
