Your mission is to help lead the team toward creating solutions that improve the way our business is run. Your knowledge of design, development, coding, testing, and application programming will help your team raise their game, meet your standards, and satisfy both business and functional requirements. Your expertise across technology domains will be counted on to set strategic direction and solve complex, mission-critical problems, internally and externally. Your quest to embrace leading-edge technologies and methodologies inspires your team to follow suit.
Responsibilities and Duties:
- As a Data Engineer, you will be responsible for the development of data pipelines for numerous applications handling all kinds of data: structured, semi-structured, and unstructured. Big data knowledge, especially in Spark and Hive, is highly preferred.
- Work in a team and provide proactive technical oversight; advise development teams, fostering re-use, design for scale, stability, and operational efficiency of data/analytical solutions
Education level:
- Bachelor's degree in Computer Science or equivalent
Experience:
- Minimum 3 years of relevant experience on production-grade projects, with hands-on, end-to-end software development experience
- Expertise in application, data and infrastructure architecture disciplines
- Expertise in designing data integrations using ETL and other data integration patterns
- Advanced knowledge of architecture, design and business processes
Proficiency in:
- Modern programming languages like Java, Python, Scala
- Big Data technologies such as Hadoop, Spark, Hive, and Kafka
- Writing well-optimized SQL queries
- Orchestration and deployment tools such as Airflow and Jenkins for CI/CD (optional)
- Responsible for design and development of integration solutions with Hadoop/HDFS, Real-Time Systems, Data Warehouses, and Analytics solutions
- Knowledge of system development lifecycle methodologies, such as Waterfall and Agile.
- An understanding of data architecture and modeling practices and concepts, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling practices.
- Experience generating physical data models and the associated DDL from logical data models.
- Experience developing data models for operational and transactional systems and for operational reporting, including the development of, or interfacing with, data analysis, data mapping, and data rationalization artifacts.
- Experience enforcing data modeling standards and procedures.
- Knowledge of web technologies, application programming languages, OLTP/OLAP technologies, data strategy disciplines, relational databases, data warehouse development and Big Data solutions.
- Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals
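To illustrate the SQL optimization skill listed above, here is a minimal, hypothetical sketch using Python's built-in sqlite3 module (the table and column names are invented). It shows how adding an index changes a query plan from a full table scan to an index seek, which is the kind of reasoning behind writing optimized SQL queries.

```python
import sqlite3

# In-memory database with a small sample table (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

# Without an index, filtering on customer_id scans the whole table.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT total FROM orders WHERE customer_id = 42").fetchall()

# With an index, SQLite can seek directly to the matching rows.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT total FROM orders WHERE customer_id = 42").fetchall()

# The last column of each plan row is a human-readable description;
# exact wording varies by SQLite version.
print(plan_before[0][-1])
print(plan_after[0][-1])
```

The same seek-versus-scan distinction applies to production databases, where the plans come from their own EXPLAIN facilities.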
Skills:
Must Know:
- Core big-data concepts
- Spark - PySpark/Scala
- Data integration tools such as Pentaho, NiFi, SSIS, etc. (at least one)
- Handling of various file formats
- Cloud platform - AWS/Azure/GCP
- Orchestration tool - Airflow
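As one concrete illustration of the "handling of various file formats" item above, here is a minimal sketch in plain Python (the sample data and field names are invented): it parses a CSV extract, casts fields to typed values, and re-serializes the records as JSON Lines, a common hand-off format between pipeline stages.

```python
import csv
import io
import json

# Hypothetical sample: a small CSV extract such as a pipeline might receive.
raw_csv = "id,name,amount\n1,alice,10.5\n2,bob,3.25\n"

# Parse the CSV into dictionaries, casting fields to proper types.
rows = []
for rec in csv.DictReader(io.StringIO(raw_csv)):
    rec["id"] = int(rec["id"])
    rec["amount"] = float(rec["amount"])
    rows.append(rec)

# Serialize as JSON Lines: one JSON object per line.
jsonl = "\n".join(json.dumps(r) for r in rows)
print(jsonl)
```

In a real pipeline the same pattern extends to Parquet, Avro, or ORC via the appropriate libraries; the stdlib-only version here just keeps the example self-contained.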
We are looking for a passionate Solutions Architect with a diverse backend technology background to lead large engineering teams at Simform. You should have a good understanding of software architecture design, backend technologies, and cloud platforms and have experience leading a team of developers.
Role + Responsibilities:
- Designing, modifying, and testing technical architecture.
- Helping project teams with solutions architecture, troubleshooting, and technical implementation assistance.
- Occasionally working as a Tech Lead if the situation demands
- Use your expertise in different tech stacks/languages, tech architecture, etc., to drive improvements in the accuracy, quality, and efficiency of projects.
- Advocating for process improvements and helping develop solutions.
- Providing technical leadership to a team throughout the project lifecycle.
- Oversee project teams' work and provide assistance for any troubleshooting or technical implementation guidance.
- Ensure standards and best practices are implemented and practiced by large cross-technical teams.
Person Specification and Qualifications:
- Technical background with over 10 years of experience across multiple technologies and a grasp of the crux of each.
- Significant hands-on experience with Node/Python
- Experience in building reliable, robust, secure, and high-performance in-production software systems with a large user base.
- Strong understanding of Cloud (Preferably AWS) and conceptual understanding of CI/CD, and DevOps.
- Strong interpersonal skills including mentoring, coaching, collaborating, and team building.
- Creative approach to problem-solving with the ability to focus on details while maintaining the “big picture” view
- Exposure to and expertise in software design patterns in general
● Create and maintain optimal data pipeline architecture.
● Assemble large, complex data sets that meet functional/non-functional business requirements.
● Building and optimizing ‘big data’ data pipelines, architectures and data sets.
● Maintain, organize & automate data processes for various use cases.
● Identifying trends, doing follow-up analysis, preparing visualizations.
● Creating daily, weekly and monthly reports of product KPIs.
● Create informative, actionable, and repeatable reporting that highlights relevant business trends and opportunities for improvement.
Required Skills And Experience:
● 2-5 years of work experience in data analytics, including analyzing large data sets.
● BTech in Mathematics/Computer Science
● Strong analytical, quantitative and data interpretation skills.
● Hands-on experience with Python, Apache Spark, Hadoop, NoSQL databases (MongoDB preferred), and Linux is a must.
● Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
● Experience with Google Cloud Data Analytics Products such as BigQuery, Dataflow, Dataproc etc. (or similar cloud-based platforms).
● Experience working within a Linux computing environment and using command-line tools, including knowledge of shell/Python scripting for automating common tasks.
● Previous experience working at startups and/or in fast-paced environments.
● Previous experience as a data engineer or in a similar role.
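The shell/Python scripting for automating common tasks mentioned above might look like this minimal sketch (the log excerpt is invented): tallying log lines by severity level, a typical small automation in a Linux environment.

```python
import collections

# Hypothetical log excerpt; in practice this would be read from a file.
log = """\
2024-05-01 10:00:01 INFO  job started
2024-05-01 10:00:02 ERROR disk full
2024-05-01 10:00:03 WARN  retrying
2024-05-01 10:00:04 ERROR disk full
2024-05-01 10:00:05 INFO  job finished
"""

# Each line is "date time LEVEL message"; the third whitespace-separated
# field is the severity level. Tally lines by level.
levels = collections.Counter(line.split()[2] for line in log.splitlines())
print(levels)
```

The equivalent shell one-liner would be something like `awk '{print $3}' app.log | sort | uniq -c`; the Python version is easier to extend with thresholds or alerting.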
Key Responsibilities:
Taking ownership of customer issues reported and seeing problems through to resolution
Understanding, interpreting, reproducing, and diagnosing issues reported by customers.
Researching, troubleshooting and identifying solutions to resolve product issues
Should be able to handle voice calls, emails and online web sessions with the customer as part of technical troubleshooting
Should exhibit patience and empathy while working with customers, with an aim to drive a positive customer experience
Following standard procedures for proper escalation of unresolved issues to the appropriate internal teams
Following standard procedures for submitting tickets to the engineering team for further investigation of unresolved issues
Contributing actively towards knowledge base articles
Adherence to strict SLAs
Ready to work early morning, general, and afternoon shifts, including weekends, on a rotational basis
Should demonstrate an aptitude and appetite for learning newer technologies while expanding on the core knowledge
The engineer should be available to work in the late evening shift (4pm - 1am) if required.
Primary skills:
3-8 years of relevant experience.
Strong technical knowledge on:
- Cloud Technologies – AWS, Azure, etc.
- Databases – SQL, Oracle, MySQL, etc.
- Operating Systems – Linux, Windows, etc.
- Networking – Basic networking concepts and troubleshooting
- Programming knowledge – Java, Python (desirable, not a must-have)
- Prior experience with REST and SOAP calls.
Excellent communication skills – English written and verbal
HTTP technology and principles, including REST and SOAP principles (Required)
JSON & XML data formats (Required)
JavaScript regular expressions (Good to have)
Good understanding of networking protocols and applications (TCP/IP, proxies, load balancing, firewalls, etc.) (Required)
Working knowledge of database technologies and SQL (Required)
In-depth familiarity with Linux (Required) (advanced user; sysadmin experience a bonus, but not required)
Strong analytical and logical reasoning for technical troubleshooting
Ability to collaborate with cross-functional teams – Dev, QA, Infrastructure teams etc.
Should be a team player who puts the team's success before individual achievements
Knowledge of data integration tools (Informatica/MuleSoft)
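Several of the required skills above (JSON and XML data formats, regular expressions) can be sketched together in a short, hypothetical Python example; the payload, ticket-id pattern, and XML shape are all invented for illustration.

```python
import json
import re
import xml.etree.ElementTree as ET

# Hypothetical API payload of the kind a support engineer might inspect.
payload = '{"ticket": {"id": "TCK-1042", "status": "open"}}'
data = json.loads(payload)

# Validate the ticket id with a regular expression (the pattern is an assumption).
ticket_id = data["ticket"]["id"]
is_valid = bool(re.fullmatch(r"TCK-\d+", ticket_id))

# Re-express the same record as XML, e.g. for a SOAP-style integration.
root = ET.Element("ticket", id=ticket_id)
ET.SubElement(root, "status").text = data["ticket"]["status"]
xml_out = ET.tostring(root, encoding="unicode")
print(is_valid, xml_out)
```

Being able to move fluently between the two formats like this is the day-to-day substance of the JSON/XML requirement.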
We are looking for an exceptionally talented Lead Data Engineer with exposure to implementing AWS services to build data pipelines, integrate APIs, and design data warehouses. A candidate with both hands-on and leadership capabilities will be ideal for this position.
Qualification: At least a bachelor's degree in Science, Engineering, or Applied Mathematics; a master's degree is preferred.
Job Responsibilities:
• Total 6+ years of experience as a Data Engineer and 2+ years of experience in managing a team
• A minimum of 3 years of AWS Cloud experience
• Well versed in languages such as Python, PySpark, SQL, NodeJS, etc.
• Has extensive experience in Spark ecosystem and has worked on both real time and batch processing
• Have experience in AWS Glue, EMR, DMS, Lambda, S3, DynamoDB, Step functions, Airflow, RDS, Aurora etc.
• Experience with modern data warehouse and query systems such as Redshift, Presto, Hive, etc.
• Has built data lakes on S3 or Apache Hudi
• Solid understanding of Data Warehousing Concepts
• Good to have: experience with tools such as Kafka or Kinesis
• Good to have: AWS Developer Associate or Solutions Architect Associate certification
• Have experience in managing a team
Operates in over 25 countries across six continents and is part of Publicis Media, one of four solution hubs within Publicis Groupe, which is present in over 100 countries and employs nearly 80,000 professionals.
We believe there are better ways for brands to connect with people. And we’re on a mission to guide brands to better connections -- across consumers, channels and partners. These are just some of the services we offer our clients in our quest to deliver ambitious outcomes.
Skills Required:
- Servlet and JSP development
- CSS, JavaScript, HTML
- AJAX, jQuery, EXTJS
- OSGi/FELIX
- Web services creation and consumption
- CMS development experience
- Java Content Repository (JCR)/CRX
- Eclipse IDE
- Maven
- SVN
- Jenkins
- Artifactory
- Apache Sling
- Lucene
- Tomcat/JBoss
- Apache Web Server
Senior Team Lead, Software Engineering (96386)
Role: Senior Team Lead
Skills: Must be an expert in the following -
- Java
- Microservices
- Hadoop
- People Management Skills.
Knowledge of the following is a plus -
- AWS
Location: Bangalore India – North Gate.
3+ years of industry experience designing, building, and supporting cloud-native systems in production, with a solid grasp of good software engineering practices such as code reviews, a deep focus on quality, and documentation
- Demonstrable backend development expertise using Java (JDK 8+, 11 preferred)
- Proven experience writing reactive, event-driven, asynchronous code using Vert.x
- Fair understanding of networks, security, system resilience & clustering
- Experience with MongoDB & ELK stack
- Experience building & consuming RESTful APIs
- Experience writing Unit & Integration tests with JUnit 5 / Jupiter
- Ability to troubleshoot and effectively resolve issues across services and multiple levels of a complex technical stack that includes microservices, container & virtualized environments, and message & event streams, in a timely manner.
- A desire to learn and apply best practices in software development
- Good communication skills and the ability to work remotely with minimal supervision
Nice to have skills:
- Oracle database experience a plus
- Experience with Apache Kafka (or Java Messaging Services in general) & Nifi
- Experience with containers & orchestration – Docker, Docker Swarm, Kubernetes
- Experience with CI/CD – Jenkins, Nexus, AWS and/or Azure Cloud infra
The IT Infrastructure Analyst is responsible for working on assigned projects and technical initiatives in an efficient/effective manner. The IT Infrastructure Analyst must possess a high degree of technical knowledge regarding server & storage systems and associated technology tools for managing them.
The individual is expected to be able to manage and support infrastructure systems as necessary, perform troubleshooting, build new systems, and monitor data center environments.
Must have:
8-12 years’ experience in IT infrastructure management
Expertise in Dell Server Environments (Dell Hardware & Management system - Open Manage)
Expertise in VMware (vCenter setup and management)
Expertise in Windows server and storage
Expertise in EMC (now Dell) SAN and NAS devices
Good Linux skills (basic troubleshooting on CentOS Linux servers)
Good networking skills (troubleshooting skills)
Familiarity with data center environmentals (power, UPS, cooling, etc.)
Good to have:
- Experience with hyper-converged infrastructure such as Nutanix
- Experience and familiarity with Dell networking technology.
- Experience and familiarity with AWS
C++ or Java developer expertise