👋🏼 We're Nagarro.
We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (19000+ experts across 33 countries, to be exact). Our work culture is dynamic and non-hierarchical. We are looking for great new colleagues. That is where you come in!
REQUIREMENTS:
- Bachelor's/Master's degree or equivalent experience in computer science
- Overall 10-12 years of experience, with at least 4 years on the Jitterbit Harmony platform and Jitterbit Cloud.
- Should have experience technically leading and grooming developers who may be geographically distributed
- Knowledge of Change & Incident Management process (JIRA etc.)
RESPONSIBILITIES:
- Responsible for end-to-end implementation of integration use cases using the Jitterbit platform.
- Coordinate with all the stakeholders for successful project execution.
- Responsible for requirements gathering, integration strategy, design, implementation, etc.
- Should have strong hands-on experience in designing, building, and deploying integration solutions using the Jitterbit Harmony platform.
- Should have developed enterprise services using REST-based APIs, SOAP web services, and various Jitterbit connectors (Salesforce, DB, JMS, File, HTTP/HTTPS, and any TMS connector).
- Should have knowledge of Custom Jitterbit Plugins and Custom Connectors.
- Experience in Jitterbit implementations, including security, logging, error handling, scalability, and clustering.
- Strong experience in Jitterbit Script, XSLT and JavaScript.
- Install, configure and deploy solution using Jitterbit.
- Provide test support for bug fixes during all stages of test cycle.
- Provide support for deployment and post go-live.
- Knowledge of professional software engineering practices and best practices for the full software development life cycle, including coding standards, code reviews, source control management, build processes, and testing.
- Understand the requirements, create necessary documentation, give presentations to clients, get necessary approvals, and create the design doc for the release.
- Estimate tasks and discuss risks/issues with clients.
- Work on specific modules independently and test the application; conduct code reviews and suggest best practices to the team.
- Broad knowledge of web standards relating to APIs (OAuth, SSL, CORS, JWT, etc.)
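As one concrete point of reference for the API standards listed above: a JWT is three base64url-encoded segments (header.payload.signature). A minimal, stdlib-only sketch of inspecting a payload follows; the function name and token are illustrative, and no signature verification is performed:

```python
import base64
import json

def decode_jwt_payload(token):
    """Decode the payload segment of a JWT WITHOUT verifying the signature.
    A JWT is three base64url-encoded segments: header.payload.signature."""
    payload_b64 = token.split(".")[1]
    # base64url encoding strips '=' padding; restore it before decoding
    payload_b64 += "=" * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(payload_b64))
```

In real integrations the signature must be verified against the issuer's key before trusting any claim; the sketch only shows the encoding structure.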
Similar jobs
Publicis Sapient Overview:
As a Senior Associate L1 in Data Engineering, you will translate client requirements into technical design and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement package solutions, and independently drive design discussions to ensure the overall health of the solution.
Job Summary:
As a Senior Associate L1 in Data Engineering, you will create technical designs and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement package solutions, and independently drive design discussions to ensure the overall health of the solution.
The role requires a hands-on technologist with a strong programming background in Java, Scala, or Python; experience in data ingestion, integration, and wrangling, computation, and analytics pipelines; and exposure to Hadoop ecosystem components. Hands-on knowledge of at least one of the AWS, GCP, or Azure cloud platforms is preferable.
Role & Responsibilities:
Job Title: Senior Associate L1 – Data Engineering
Your role is focused on the design, development, and delivery of solutions involving:
• Data Ingestion, Integration and Transformation
• Data Storage and Computation Frameworks, Performance Optimizations
• Analytics & Visualizations
• Infrastructure & Cloud Computing
• Data Management Platforms
• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time
• Build functionality for data analytics, search and aggregation
Experience Guidelines:
Mandatory Experience and Competencies:
1. Overall 3.5+ years of IT experience, with 1.5+ years in data-related technologies
2. Minimum 1.5 years of experience in Big Data technologies
3. Hands-on experience with the Hadoop stack – HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow, and other components required to build end-to-end data pipelines. Working knowledge of real-time data pipelines is an added advantage.
4. Strong experience in at least one of the programming languages Java, Scala, or Python; Java preferable
5. Hands-on working knowledge of NoSQL and MPP data platforms like HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.
Preferred Experience and Knowledge (Good to Have):
1. Good knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres), with hands-on experience
2. Knowledge of data governance processes (security, lineage, catalog) and tools like Collibra, Alation, etc.
3. Knowledge of distributed messaging frameworks like ActiveMQ / RabbitMQ / Solace, search & indexing, and microservices architectures
4. Performance tuning and optimization of data pipelines
5. CI/CD – infra provisioning on cloud, automated build & deployment pipelines, code quality
6. Working knowledge of data platform related services on at least one cloud platform, IAM, and data security
7. Cloud data specialty and other related Big Data technology certifications
Personal Attributes:
• Strong written and verbal communication skills
• Articulation skills
• Good team player
• Self-starter who requires minimal oversight
• Ability to prioritize and manage multiple tasks
• Process orientation and the ability to define and set up processes
Your Day-to-Day
- Assist our Growth Strategists in analyzing the results of A/B experiments.
- Analyze customer engagement rates and customer acquisition projects (e.g., churn-rate prediction, attribution, etc.).
- Analyse marketing channel performance and deliver deep-dive reports to stakeholders and management.
- Build statistical experimentation templates for faster A/B outputs.
- Work on forecasting models and assist senior management in creating frameworks for growth models.
- Local implementation of marketing analytics projects related to improving marketing channel effectiveness, customer segmentation, campaign optimization, etc.
- Monitor campaigns against key performance indicators (KPIs); be fully aware of trends, analytics, successes, and risks in order to achieve business objectives.
- Communicate complex ideas through understandable reports/documentation, leveraging leading software tools such as Tableau.
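The A/B analysis and experimentation templates above typically rest on a standard two-proportion z-test; a hedged, stdlib-only sketch (the function name and conversion counts are hypothetical):

```python
import math

def ab_ztest(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference
    p = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical experiment: 120/2400 conversions on A vs 165/2400 on B
z, p = ab_ztest(conv_a=120, n_a=2400, conv_b=165, n_b=2400)
```

A dedicated stats library would normally be used in practice; the point is only what the template computes.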
Your Know-How
- 3-5 years of experience in strategy / consulting / analytical / project management roles; experience in e-commerce, start-ups, or unicorns (CARS24, OLA, SWIGGY, FLIPKART, OYO) or entrepreneurial experience preferred, plus at least 2 years of experience leading a team
- Top-notch academics from a Tier 1 college (IIM / IIT / NIT)
- Must have SQL/PostgreSQL/Tableau experience.
- Added advantage: experience with Google Analytics, CRM (MoEngage, Braze, Leanplum).
- Preferred: knowledge of statistical computer languages (Python, R, etc.).
- Analytical mindset with the ability to present data in a structured and informative way
- Enjoy a fast-paced environment and can align business objectives with product priorities
Requirements
Srijan Technologies is hiring for Delivery Manager - Data Science/Data Engineering
with a permanent WFH option.
Notice: Immediate joiners or candidates with 30 days of notice period are preferred.
Required Skills/Experience:
• Overall experience of 10+ years in the industry
• Good experience in Agile methodology
• Experience in delivering data engineering and data integration projects is a must
• Experience in delivering analytical programs is a must
• Experience in delivering AI / ML projects will be an added advantage
• Analytical skills
• Well-developed interpersonal skills
• Commercial awareness
• Effective communication
• Team-working skills
• Ability to motivate people
• Management and leadership skills
- Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy
- Expertise in SQL/PL-SQL – ability to write procedures and create queries for reporting purposes.
- Must have worked on a reporting tool – Power BI/Tableau, etc.
- Strong knowledge of Excel/Google Sheets – must have worked with pivot tables, aggregate functions, and logical IF conditions.
- Strong verbal and written communication skills for coordination with departments.
- An analytical mind and inclination for problem-solving
1) Understand the business objectives, formulate hypotheses, and collect the relevant data using SQL/R/Python. Analyse bureau, customer, and lending performance data on a periodic basis to generate insights. Present complex information and data in an uncomplicated, easy-to-understand way to drive action.
2) Independently build and refit robust models for achieving game-changing growth while managing risk.
3) Identify and implement new analytical/modelling techniques to improve model performance across the customer lifecycle (acquisitions, management, fraud, collections, etc.).
4) Help define the data infrastructure strategy for the Indian subsidiary.
a. Monitor data quality and quantity.
b. Define a strategy for acquisition, storage, retention, and retrieval of data elements, e.g., identify new data types and collaborate with technology teams to capture them.
c. Build a culture of strong automation and monitoring.
d. Stay connected to Analytics industry trends – data, techniques, technology, etc. – and leverage them to continuously evolve data science standards at Credit Saison.
Required Skills & Qualifications:
1) 3+ years working in data science domains with experience in building risk models. Fintech/Financial analysis experience is required.
2) Expert-level proficiency in analytical tools and languages such as SQL, Python, R/SAS, VBA, etc.
3) Experience with building models using common modelling techniques (logistic and linear regressions, decision trees, etc.)
4) Strong familiarity with Tableau/Power BI/Qlik Sense or other data visualization tools
5) Tier 1 college graduate (IIT/IIM/NIT/BITS preferred).
6) Demonstrated autonomy, thought leadership, and learning agility.
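To make the first modelling technique listed above concrete, here is a minimal logistic regression fitted by plain gradient descent on invented toy data (all names and numbers are hypothetical; production risk models would use established libraries):

```python
import math

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    """Fit a one-feature logistic regression y ~ sigmoid(w*x + b)
    by plain gradient descent on the log-loss."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            gw += (p - y) * x   # log-loss gradient w.r.t. w
            gb += (p - y)       # log-loss gradient w.r.t. b
        w -= lr * gw / n
        b -= lr * gb / n
    return w, b

# Toy risk data: feature = missed payments, label = default (1) or not (0)
xs = [0, 0, 1, 1, 2, 2, 3, 3, 4, 4]
ys = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]
w, b = fit_logistic(xs, ys)
predict = lambda x: 1.0 / (1.0 + math.exp(-(w * x + b)))
```

The same model produced by a library fit would differ only in optimizer details; the decision boundary here falls near two missed payments by construction of the toy data.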
Work days: Sun-Thu
Day shift
We are building a global content marketplace that brings companies and content creators together to scale up content creation processes across 50+ content verticals and 150+ industries. Over the past 2.5 years, we've worked with companies like India Today, Amazon India, Adobe, Swiggy, Dunzo, Businessworld, Paisabazaar, IndiGo Airlines, Apollo Hospitals, Infoedge, Times Group, Digit, BookMyShow, UpGrad, Yulu, YourStory, and 350+ other brands.
Our mission is to become the world's largest content creation and distribution platform for all kinds of content creators and brands.
Our Team
We are a 25+ member company that is scaling up rapidly in both team size and ambition.
If we were to define the kind of people and the culture we have, it would be -
a) Individuals with an Extreme Sense of Passion About Work
b) Individuals with Strong Customer and Creator Obsession
c) Individuals with Extraordinary Hustle, Perseverance & Ambition
We are on the lookout for individuals who are always open to going the extra mile and thrive in a fast-paced environment. We are strong believers in building a great, enduring company that can outlast its builders and create a massive impact on the lives of our employees, creators, and customers alike.
Our Investors
We are fortunate to be backed by some of the industry's most prolific angel investors: Kunal Bahl and Rohit Bansal (Snapdeal founders); YourStory Media (Shradha Sharma); Dr. Saurabh Srivastava, Co-founder of IAN and NASSCOM; Slideshare Co-founder Amit Ranjan; Indifi's Co-founder and CEO Alok Mittal; Sidharth Rao, Chairman of Dentsu Webchutney; Ritesh Malik, Co-founder and CEO of Innov8; Sanjay Tripathy, former CMO of HDFC Life and CEO of Agilio Labs; Manan Maheshwari, Co-founder of WYSH; and Hemanshu Jain, Co-founder of Diabeto.
Backed by Lightspeed Venture Partners
Job Responsibilities:
- Design, develop, test, deploy, maintain, and improve ML models
- Implement novel learning algorithms and recommendation engines
- Apply Data Science concepts to solve routine problems of target users
- Translate business analysis needs into well-defined machine learning problems, and select appropriate models and algorithms
- Create an architecture; implement, maintain, and monitor various data source pipelines that can be used across different types of data sources
- Monitor performance of the architecture and conduct optimization
- Produce clean, efficient code based on specifications
- Verify and deploy programs and systems
- Troubleshoot, debug, and upgrade existing applications
- Guide junior engineers toward productive contribution to the development
The ideal candidate must have:
ML and NLP Engineer
- 4 or more years of experience in ML Engineering
- Proven experience in NLP
- Familiarity with generative language models such as GPT-3
- Ability to write robust code in Python
- Familiarity with ML frameworks and libraries
- Hands-on experience with AWS services like SageMaker and Personalize
- Exposure to state-of-the-art techniques in ML and NLP
- Understanding of data structures, data modeling, and software architecture
- Outstanding analytical and problem-solving skills
- Team player with the ability to work cooperatively with other engineers
- Ability to make quick decisions in high-pressure environments with limited information
2. Should understand the importance and know-how of taking a machine-learning-based solution to the consumer.
3. Hands-on experience with statistical and machine-learning tools and techniques.
4. Good exposure to deep learning libraries like TensorFlow and PyTorch.
5. Experience in implementing deep learning techniques, computer vision, and NLP. The candidate should be able to develop the solution from scratch, with GitHub code to show for it.
6. Should be able to read research papers and pick up ideas to quickly reproduce research in whichever deep learning library they are most comfortable with.
7. Should be strong in data structures and algorithms. Should be able to do code complexity analysis/optimization for smooth delivery to production.
8. Expert-level coding experience in Python.
9. Technologies: Backend - Python (Programming Language)
10. Should have the ability to think about long-term solutions, modularity, and reusability of components.
11. Should be able to work in a collaborative way. Should be open to learning from peers as well as constantly bringing new ideas to the table.
12. Self-driven. Open to peer criticism and feedback, and able to take it positively. Ready to be held accountable for the responsibilities undertaken.
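The code complexity analysis/optimization expected in point 7 can be illustrated with a small hypothetical example: the same duplicate check rewritten from an O(n²) pairwise comparison to a single O(n) pass with a set:

```python
def has_duplicates_quadratic(items):
    """O(n^2): compares every pair of elements."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicates_linear(items):
    """O(n): a set of seen values makes each membership check O(1) on average."""
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False
```

Both return the same answers; the second trades O(n) extra memory for the quadratic scan, which is the kind of trade-off the analysis is meant to surface before production delivery.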