● Bachelor’s degree or equivalent experience
● Knowledge of database fundamentals and fluency in advanced SQL, including concepts
such as windowing functions
● Knowledge of popular scripting languages for data processing such as Python, as well as
familiarity with common frameworks such as Pandas
● Experience building streaming ETL pipelines with tools such as Apache Flink, Apache Beam, Google Cloud Dataflow, dbt, or equivalents
● Experience building batch ETL pipelines with tools such as Apache Airflow, Spark, DBT, or
custom scripts
● Experience working with messaging systems such as Apache Kafka (and hosted equivalents such as Amazon MSK) or Apache Pulsar
● Familiarity with BI applications such as Tableau, Looker, or Superset
● Hands-on coding experience in Java or Scala
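To make the "windowing functions" item above concrete, here is a minimal, hedged sketch using Python's built-in sqlite3 module (SQLite 3.25+ supports window functions); the table and column names are invented for illustration and are not from this posting.

```python
import sqlite3

# In-memory database with a toy "orders" table (names are illustrative).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (user_id TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("a", 10.0), ("a", 5.0), ("b", 7.0), ("a", 2.0)],
)

# Window function: running total of amount per user, in insertion order.
rows = conn.execute(
    """
    SELECT user_id,
           amount,
           SUM(amount) OVER (PARTITION BY user_id
                             ORDER BY rowid) AS running_total
    FROM orders
    """
).fetchall()

for user_id, amount, running_total in rows:
    print(user_id, amount, running_total)
```

The `PARTITION BY` / `ORDER BY` clause inside `OVER` is what distinguishes a window aggregate from a plain `GROUP BY`: every input row is preserved, and each row carries its own running total.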
Publicis Sapient Overview:
As a Senior Associate L1 in Data Engineering, you will translate client requirements into technical designs and implement components for data engineering solutions. You will apply a deep understanding of data integration and big data design principles to create custom solutions or implement package solutions, and you will independently drive design discussions to ensure the health of the overall solution.
Job Summary:
As Senior Associate L1 in Data Engineering, you will produce technical designs and implement components for data engineering solutions. You will apply a deep understanding of data integration and big data design principles to create custom solutions or implement package solutions, and you will independently drive design discussions to ensure the health of the overall solution.
The role requires a hands-on technologist with a strong programming background in Java, Scala, or Python; experience in data ingestion, integration, wrangling, computation, and analytics pipelines; and exposure to Hadoop ecosystem components. Hands-on knowledge of at least one of the AWS, GCP, or Azure cloud platforms is preferred.
Job Title: Senior Associate L1 – Data Engineering
Role & Responsibilities:
Your role is focused on the design, development, and delivery of solutions involving:
• Data Ingestion, Integration and Transformation
• Data Storage and Computation Frameworks, Performance Optimizations
• Analytics & Visualizations
• Infrastructure & Cloud Computing
• Data Management Platforms
• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time
• Build functionality for data analytics, search and aggregation
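One minimal way to picture ingestion from heterogeneous sources is a normalization step that maps each source's format into a common record shape. The stdlib-only sketch below is a hedged illustration (the feed contents and field names are assumptions, not from this posting), merging a CSV feed and a JSON feed:

```python
import csv
import io
import json

# Two toy feeds in different formats (contents are illustrative).
csv_feed = "id,value\n1,10\n2,20\n"
json_feed = '[{"id": "3", "value": 30}]'

def normalize(record):
    """Map a raw record into a common shape with typed fields."""
    return {"id": int(record["id"]), "value": float(record["value"])}

records = []
records.extend(normalize(r) for r in csv.DictReader(io.StringIO(csv_feed)))
records.extend(normalize(r) for r in json.loads(json_feed))

print(records)
```

In a real batch or streaming pipeline the same normalize-at-the-edge idea applies; only the transports (files, Kafka topics, REST pulls) and schemas differ.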
Experience Guidelines:
Mandatory Experience and Competencies:
# Competency
1. Overall 3.5+ years of IT experience, with 1.5+ years in data-related technologies
2. Minimum 1.5 years of experience in Big Data technologies
3. Hands-on experience with the Hadoop stack – HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow, and other components required to build end-to-end data pipelines. Working knowledge of real-time data pipelines is an added advantage.
4. Strong experience in at least one of the programming languages Java, Scala, or Python; Java preferred
5. Hands-on working knowledge of NoSQL and MPP data platforms such as HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.
Preferred Experience and Knowledge (Good to Have):
# Competency
1. Good knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres), with hands-on experience
2. Knowledge of data governance processes (security, lineage, catalog) and tools such as Collibra, Alation, etc.
3. Knowledge of distributed messaging frameworks such as ActiveMQ, RabbitMQ, or Solace; search and indexing; and microservices architectures
4. Performance tuning and optimization of data pipelines
5. CI/CD – infrastructure provisioning on cloud, automated build and deployment pipelines, code quality
6. Working knowledge of data platform related services on at least one cloud platform, IAM, and data security
7. Cloud data specialty and other related Big Data technology certifications
Personal Attributes:
• Strong written and verbal communication skills
• Articulation skills
• Good team player
• Self-starter who requires minimal oversight
• Ability to prioritize and manage multiple tasks
• Process orientation and the ability to define and set up processes
• Design, implement, and improve the analytics platform
• Implement and simplify self-service data query and analysis capabilities of the BI platform
• Develop and improve the current BI architecture, emphasizing data security, data quality and timeliness, scalability, and extensibility
• Deploy and use various big data technologies and run pilots to design low-latency data architectures at scale
• Collaborate with business analysts, data scientists, product managers, software development engineers, and other BI teams to develop, implement, and validate KPIs, statistical analyses, data profiling, prediction, forecasting, clustering, and machine learning algorithms
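As a hedged illustration of the forecasting item above, and deliberately not any team's actual method, here is the kind of trailing moving-average baseline a BI team might validate a KPI prediction against (the metric name is invented):

```python
def moving_average_forecast(series, window=3):
    """Forecast the next value as the mean of the last `window` observations."""
    if len(series) < window:
        raise ValueError("need at least `window` observations")
    return sum(series[-window:]) / window

# Hypothetical daily KPI values; the forecast averages the last three.
daily_signups = [100, 120, 110, 130, 125]
print(moving_average_forecast(daily_signups))  # mean of 110, 130, 125
```

Baselines this simple are useful precisely because any more sophisticated model (clustering-informed or ML-based) should beat them measurably before being deployed.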
At Ganit we are building an elite team, so we are seeking candidates who possess the following backgrounds:
7+ years relevant experience
Expert level skills writing and optimizing complex SQL
Knowledge of data warehousing concepts
Experience in data mining, profiling, and analysis
Experience with complex data modelling, ETL design, and using large databases
in a business environment
Proficiency with Linux command line and systems administration
Experience with languages like Python/Java/Scala
Experience with Big Data technologies such as Hive/Spark
Proven ability to develop unconventional solutions; sees opportunities to innovate and leads the way
Good experience working with cloud platforms such as AWS, GCP, and Azure, including projects involving the creation of a data lake or data warehouse
Excellent verbal and written communication.
Proven interpersonal skills and ability to convey key insights from complex analyses in
summarized business terms. Ability to effectively communicate with multiple teams
Good to have
AWS/GCP/Azure Data Engineer Certification
We are looking for a Business Intelligence (BI)/Data Analyst to create and manage Power BI and analytics solutions that turn data into knowledge. In this role, you should have a background in data and business analysis. If you are self-directed, passionate about data, and have business acumen and problem-solving aptitude, we'd like to meet you. Ultimately, you will enhance our business intelligence system to help us make better decisions.
Requirements and Qualifications
- BSc/BA in Computer Science, Engineering, or relevant field.
- Financial experience and Marketing background is a plus
- Strong Power BI development skills, including migration of existing deliverables to Power BI.
- Ability to work autonomously
- Data modelling, calculations, conversions, and scheduling data refreshes in Power BI.
- Proven experience as a Power BI Developer is a must.
- Industry experience is preferred. Familiarity with other BI tools (Tableau, QlikView).
- Analytical mind with a problem-solving aptitude.
Responsibilities
- Design, develop and maintain business intelligence solutions
- Craft and execute queries upon request for data
- Present information through reports and visualization based on requirements gathered from stakeholders
- Interact with the team to gain an understanding of the business environment, technical context, and organizational strategic direction
- Design, build and deploy new, and extend existing dashboards and reports that synthesize distributed data sources
- Ensure data accuracy, performance, usability, and functionality requirements of BI platform
- Manage data through MS Excel, Google Sheets, and SQL applications, as required, and support other analytics platforms
- Develop and execute database queries and conduct analyses
- Develop and update technical documentation requirements
- Communicate insights to both technical and non-technical audiences.
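The query-related responsibilities above often come down to parameterized SQL executed on request. As a hedged sketch (the schema and data are invented), the snippet below uses Python's built-in sqlite3 module and shows why `?` parameter binding is the right habit for ad-hoc stakeholder requests:

```python
import sqlite3

# Toy sales table standing in for a real BI data source.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("EMEA", 100.0), ("APAC", 80.0), ("EMEA", 50.0)],
)

def revenue_for(region):
    """Run a parameterized aggregate query; `?` binding avoids SQL injection."""
    row = conn.execute(
        "SELECT COALESCE(SUM(revenue), 0) FROM sales WHERE region = ?",
        (region,),
    ).fetchone()
    return row[0]

print(revenue_for("EMEA"))  # 150.0
```

`COALESCE` makes the empty-region case return 0 rather than NULL, which keeps downstream reports from breaking on missing data.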
We are looking for candidates who have demonstrated both a strong business sense and deep understanding of the quantitative foundations of modelling.
• Excellent analytical and problem-solving skills, including the ability to disaggregate issues, identify root causes and recommend solutions
• Statistical programming experience in SPSS and comfort working with large data sets
• R, Python, SAS, and SQL are preferred but not mandatory
• Excellent time management skills
• Good written and verbal communication skills; understanding of both written and spoken English
• Strong interpersonal skills
• Ability to act autonomously, bringing structure and organization to work
• Creative and action-oriented mindset
• Ability to interact in a fluid, demanding and unstructured environment where priorities evolve constantly, and methodologies are regularly challenged
• Ability to work under pressure and deliver on tight deadlines
Qualifications and Experience:
• Graduate degree in: Statistics/Economics/Econometrics/Computer
Science/Engineering/Mathematics/MBA (with a strong quantitative background) or
equivalent
• Strong track record work experience in the field of business intelligence, market
research, and/or Advanced Analytics
• Knowledge of data collection methods (focus groups, surveys, etc.)
• Knowledge of statistical packages (SPSS, SAS, R, Python, or similar), databases,
and MS Office (Excel, PowerPoint, Word)
• Strong analytical and critical thinking skills
• Industry experience in Consumer Experience/Healthcare a plus
- Develop, train, and optimize machine learning models using Python, ML algorithms, deep learning frameworks (e.g., TensorFlow, PyTorch), and other relevant technologies.
- Implement MLOps best practices, including model deployment, monitoring, and versioning.
- Utilize Vertex AI, MLFlow, KubeFlow, TFX, and other relevant MLOps tools and frameworks to streamline the machine learning lifecycle.
- Collaborate with cross-functional teams to design and implement CI/CD pipelines for continuous integration and deployment using tools such as GitHub Actions, TeamCity, and similar platforms.
- Conduct research and stay up-to-date with the latest advancements in machine learning, deep learning, and MLOps technologies.
- Provide guidance and support to data scientists and software engineers on best practices for machine learning development and deployment.
- Assist in developing tooling strategies by evaluating various options, vendors, and product roadmaps to enhance the efficiency and effectiveness of our AI and data science initiatives.
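Tools like MLFlow, Vertex AI, and KubeFlow handle experiment tracking and model versioning in practice. The stdlib-only sketch below is no substitute for them, and its file-free layout and names are invented; it only illustrates the core MLOps practice named above: record each run's parameters and metrics under a deterministic version id.

```python
import hashlib
import json

def log_run(params, metrics, store):
    """Record one training run under a version id derived from its parameters."""
    blob = json.dumps(params, sort_keys=True).encode()
    version = hashlib.sha256(blob).hexdigest()[:12]
    store[version] = {"params": params, "metrics": metrics}
    return version

runs = {}
v1 = log_run({"lr": 0.01, "epochs": 10}, {"accuracy": 0.91}, runs)
v2 = log_run({"lr": 0.01, "epochs": 10}, {"accuracy": 0.91}, runs)
assert v1 == v2  # identical params -> identical version id
print(v1, runs[v1]["metrics"])
```

Deriving the version from a canonical serialization of the parameters is what makes runs reproducible and deduplicated; real trackers add artifacts, code revisions, and environment capture on top of the same idea.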
ADF Developer with top conglomerates for Kochi location – Air India
conducting F2F Interviews on 22nd April 2023
Experience - 2-12 years.
Location - Kochi only (work from the office only)
Notice period - 1 month only.
If you are interested, please share the following information at your earliest convenience.
- Understand long-term and short-term business requirements to precisely match them with the capabilities of different distributed storage and computing technologies from the plethora of options available in the ecosystem.
- Create complex data processing pipelines
- Design scalable implementations of the models developed by our Data Scientists.
- Deploy data pipelines in production systems based on CI/CD practices
- Create and maintain clear documentation on data models/schemas as well as transformation/validation rules
- Troubleshoot and remediate data quality issues raised by pipeline alerts or downstream consumers
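A data quality check of the kind the last bullet describes can be as simple as validating each row against expected rules and quarantining violators. This is a hedged sketch; the rules and field names are invented for illustration:

```python
def validate(row):
    """Return a list of rule violations for one row (empty list = clean)."""
    problems = []
    if row.get("id") is None:
        problems.append("missing id")
    if not isinstance(row.get("amount"), (int, float)) or row["amount"] < 0:
        problems.append("bad amount")
    return problems

# One clean row and one row that violates both rules.
rows = [{"id": 1, "amount": 9.5}, {"id": None, "amount": -2}]
clean = [r for r in rows if not validate(r)]
quarantined = [(r, validate(r)) for r in rows if validate(r)]
print(len(clean), quarantined)
```

Routing quarantined rows (with their violation list) to an alert or dead-letter destination, rather than silently dropping them, is what lets downstream consumers raise the pipeline alerts mentioned above.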
Azure – Data Engineer
- At least 2 years of hands-on experience working in an Agile data engineering team building big data pipelines using Azure in a commercial environment.
- Dealing with senior stakeholders/leadership
- Understanding of Azure data security and encryption best practices. [ADFS/ACLs]
Databricks – experience writing in and using Databricks, using Python to transform and manipulate data.
Data Factory – experience using Data Factory in an enterprise solution to build data pipelines; experience calling REST APIs.
Synapse/data warehouse – experience using Synapse or a data warehouse to present data securely and to build and manage data models.
Microsoft SQL Server – we'd expect the candidate to have come from a SQL/data background and progressed into Azure.
Power BI – experience with this is preferred.
Additionally
- Experience using Git as a source control system
- Understanding of DevOps concepts and application
- Understanding of Azure Cloud costs/management and running platforms efficiently
The Job
The Architect, Machine Learning and Artificial Intelligence (including Computer Vision) will grow and lead a team of talented Machine Learning (ML), Computer Vision (CV), and Artificial Intelligence (AI) researchers and engineers to develop innovative machine learning algorithms, scalable ML systems, and AI applications for Racetrack. This role will focus on developing and deploying personalization and recommender systems, search, experimentation, audience, and content AI solutions to drive user experience and growth.
The Daily
- Develop innovative data science solutions that utilize machine learning and deep learning algorithms, statistical and quantitative modelling approaches to support product, engineering, content, and marketing initiatives.
- Build and lead a world-class team of ML and AI scientists and engineers.
- Be a hands-on leader who mentors the team in the latest machine learning and deep learning approaches and introduces new technologies and processes. Single-handedly manage MVPs and PoCs.
- Work with ML engineers to design solution architectures and develop scalable machine learning systems to accelerate the learning cycle.
- Identify data science opportunities that deliver business value.
- Develop ML/AI/CV roadmap and educate both internal and external stakeholders at all levels to drive implementation and measurement.
- Hands-on experience in image processing for the auto industry
- BFSI domain knowledge is a plus
- Provide thought leadership to enable ML/AI applications.
- Manage products priorities and ensure timely delivery.
- Develop and evangelize best practices for scoping, building, validating, deploying, and monitoring ML/AI products.
- Prepare and present to senior leadership ML modelling results and analytical insights that help drive the business.
The Essentials
- 8+ years of work experience in Machine Learning, AI, and Data Science with a proven track record of driving innovation and business impact
- 4+ years of managing a team of data scientists and ML and AI researchers and engineers
- Strong machine learning, deep learning, and statistical modelling expertise, such as causal inference modelling, ensembles, neural networks, reinforcement learning, NLP, and computer vision
- Advanced knowledge of SQL and experience with big data platforms (AWS, Snowflake, Spark, Google Cloud, etc.)
- Proficiency in machine learning and deep learning languages and platforms (Python, R, TensorFlow, Keras, PyTorch, MXNet etc.)
- Experience in deploying machine learning algorithms and advanced modelling solutions
- Experience in developing advanced analytics and ML infrastructure and system
- Self-starter and self-motivated with the proven ability to deliver results in a fast-paced, high-energy environment
- Strong communication skills and the ability to explain complex analyses and algorithms to non-technical audiences
- Works effectively with cross-functional teams to build trusted partnerships
- Working experience in digital media and entertainment industry preferred
- Experience with Agile methodologies preferred