The Sr. Analytics Engineer provides technical expertise in needs identification, data modeling, data movement, transformation mapping (source to target), and automation and testing strategies, translating business needs into technical solutions while adhering to established data guidelines and approaches from a business-unit or project perspective.
Understands and leverages best-fit technologies (e.g., traditional star schema structures, cloud, Hadoop, NoSQL, etc.) and approaches to address business and environmental challenges.
Provides data understanding and coordinates data-related activities with other data management groups such as master data management, data governance, and metadata management.
Actively participates with other consultants in problem-solving and approach development.
Responsibilities:
Take a consultative approach with business users, asking questions to understand the business need and deriving the data flow and the conceptual, logical, and physical data models from those needs.
Perform data analysis to validate data models and to confirm the ability to meet business needs.
Assist with and support setting the data architecture direction, ensuring data architecture deliverables are developed, ensuring compliance to standards and guidelines, implementing the data architecture, and supporting technical developers at a project or business unit level.
Coordinate and consult with the Data Architect, project manager, client business staff, client technical staff, and project developers on data architecture best practices and anything else that is data-related at the project or business-unit level.
Work closely with Business Analysts and Solution Architects to design the data model satisfying the business needs and adhering to Enterprise Architecture.
Coordinate with Data Architects and Program Managers, and participate in recurring meetings.
Help and mentor team members to understand the data model and subject areas.
Ensure that the team adheres to best practices and guidelines.
Requirements:
- At least 3 years of strong working knowledge of Spark, Java/Scala/PySpark, Kafka, Git, Unix/Linux, and ETL pipeline design.
- Experience with Spark optimization, tuning, and resource allocation.
- Excellent understanding of in-memory distributed computing frameworks such as Spark, including parameter tuning and writing optimized workflow sequences (a minimal PySpark sketch follows this list).
- Experience with relational databases (e.g., PostgreSQL, MySQL) and NoSQL or analytical databases (e.g., Redshift, BigQuery, Cassandra).
- Familiarity with Docker, Kubernetes, Azure Data Lake/Blob Storage, AWS S3, Google Cloud Storage, etc.
- Have a deep understanding of the various stacks and components of the Big Data ecosystem.
- Hands-on experience with Python is a huge plus
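To illustrate the Spark tuning and source-to-target work referenced above, here is a minimal PySpark sketch. The configuration values, S3 paths, and column names are illustrative assumptions only; real settings depend on the cluster and workload.

```python
from pyspark.sql import SparkSession

# Minimal sketch: illustrative Spark resource and tuning settings
# (values are assumptions; real values depend on cluster size and workload).
spark = (
    SparkSession.builder
    .appName("etl-pipeline-example")
    .config("spark.executor.memory", "8g")          # executor heap size
    .config("spark.executor.cores", "4")            # cores per executor
    .config("spark.sql.shuffle.partitions", "200")  # shuffle parallelism
    .config("spark.sql.adaptive.enabled", "true")   # adaptive query execution
    .getOrCreate()
)

# A simple source-to-target transformation: read, filter, aggregate, write.
# Paths and columns are hypothetical.
orders = spark.read.parquet("s3://example-bucket/raw/orders/")
daily_totals = (
    orders.filter(orders.status == "COMPLETED")
          .groupBy("order_date")
          .sum("amount")
)
daily_totals.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_totals/")
```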
Similar jobs
The candidate should have strong knowledge of:
1. Core Java
2. JSP
3. Spring Framework
4. Spring Boot
5. SOAP and REST web services
6. Application security
The candidate should have 4 to 6 years of work experience in Java and the related technologies mentioned above.
The candidate should have good Oracle Database knowledge and good communication skills.
Work experience in the BFSI domain is an added advantage.
The candidate needs to work from the Mumbai location and must be available to join the office.
• As a Python full-stack developer, your role will involve designing, developing, and deploying full-stack applications for artificial-intelligence projects, with a focus on low latency and scalability.
• You will also need to optimize applications for better performance and large numbers of concurrent users.
• We are looking for a strong technologist who cares about doing things the right way rather than just doing them, and who thrives in a complex and challenging environment.
Who are we looking for?
• Bachelor's/Master's in Computer Science or equivalent, with at least 3 years of professional experience.
• Solid understanding of design patterns, data structures, and advanced programming techniques
• As an Engineer in our team, you will design, code, test, and debug quality software programs.
• Strong software design and architectural skills in object-oriented and functional programming styles.
• Python, Celery, RabbitMQ, Kafka, Multithreading, Async, Microservices, Docker, Kubernetes.
• Experience in working with Machine Learning Pipelines
• Experience in React.js.
• Experience in Celery and RabbitMQ/Kafka (a minimal Celery sketch follows this list).
• Experience with unit testing tools.
• Experience working with SQL and NoSQL databases such as MySQL and MongoDB.
• Exposure to cloud technologies.
• Demonstrated ability to work in a fast-paced, hyper-growth environment where requirements are constantly changing.
• Nice to have: Experience developing products containing machine learning use cases.
• Familiar with agile techniques like code reviews, pair programming, collective code ownership, clean code, TDD and refactoring.
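As a rough illustration of the Celery/RabbitMQ asynchronous-processing experience listed above, here is a minimal sketch. The broker URL, task name, and retry policy are illustrative assumptions, not a prescribed setup.

```python
# Minimal sketch of an asynchronous task with Celery and RabbitMQ.
# The broker URL, task name, and module layout are illustrative assumptions.
from celery import Celery

app = Celery("tasks", broker="amqp://guest:guest@localhost:5672//")

@app.task(bind=True, max_retries=3)
def score_document(self, document_id: int) -> dict:
    """Run a placeholder ML scoring step for one document, retrying on failure."""
    try:
        # In a real pipeline this would load the document and call a model.
        return {"document_id": document_id, "score": 0.87}
    except Exception as exc:
        raise self.retry(exc=exc, countdown=10)

# Producer side: enqueue work without blocking the web request.
# score_document.delay(42)
```

The commented producer call shows the usage pattern: the web process enqueues the task with `.delay()` and returns immediately, while a Celery worker consumes it from the RabbitMQ queue.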
Must have experience in Django. [Mandatory]
- The ability to problem-solve and think critically.
- High level of knowledge of Python and the Django framework.
- Familiarity with event-driven programming as well as the MVC pattern.
- Good understanding of SQL databases.
- Good understanding of REST APIs (a minimal endpoint sketch follows this list).
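As a rough sketch of the Django and REST knowledge listed above, here is a minimal read-only endpoint using plain Django (no Django REST Framework); the `Book` model and app name are hypothetical.

```python
# Minimal sketch of a read-only REST endpoint in plain Django,
# assuming a hypothetical Book model; names are illustrative.
from django.http import JsonResponse
from django.urls import path
from django.views.decorators.http import require_GET

from myapp.models import Book  # hypothetical app/model


@require_GET
def list_books(request):
    # Serialize a few fields straight from the ORM into JSON.
    books = list(Book.objects.values("id", "title", "author"))
    return JsonResponse({"results": books})


urlpatterns = [
    path("api/books/", list_books, name="book-list"),
]
```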
.Net Core Developer
Experience: 2+ years in .Net Core/C#
Responsibilities: Responsible for designing and developing REST APIs using the .Net Core framework and C#. Create high-performance REST APIs for financial applications.
Qualifications: B.Tech/BE or MCA (may be relaxed in the case of experienced candidates).
Experience: At least 2+ years of experience as a .Net Core developer. Proficient in .Net Core and C#. Excellent knowledge of developing REST APIs and Entity Framework. Knowledge and experience in writing SQL, stored procedures, and triggers. Sound knowledge of MVC frameworks and databases. Good project management skills. Good communication skills. A good team player.
Salary: 2-8 LPA
Job Description:
As a Software Development Engineer at Jumbotail you will-
§ Be a part of our initial core team to design and develop our marketplace platforms from scratch
§ Work on building scalable backend platforms for customer & seller apps, the brands platform, the demand generation platform, the supply chain & logistics platform, the credit platform, and several cross-platform software components
§ Participate in the process to fundamentally change the food and grocery ecosystem in India, and
impact billions of people through technology, mobile, and data science
Requirements:
An ideal candidate for this role is someone who has-
§ BE/BTech degree in Computer Science from a top engineering school
§ 3-5 years of professional software development experience
§ Strong problem-solving skills and a strong command of object-oriented design, data structures, algorithms, and other computer science fundamentals.
§ Strong coding skills – professional experience in developing production-quality software in Java.
§ Expertise in Web Services, Service-oriented architecture, Databases, NoSQL, Distributed systems,
Cloud Technologies.
§ Extreme software engineering skills to design and develop low-latency, high-availability, internet-scale web services
§ Solid understanding of the full software development life cycle and software engineering best
practices
§ Ability to understand business requirements and translate them into technical requirements
§ Demonstrated ability to own software design and development end to end from requirements to
launch.
§ Ability to collaborate with cross-functional teams to define, design, and ship new features.
§ A startup mindset – an athlete who can run at the breakneck speed of a startup, yet someone who can bring method to the madness through processes suitable for different stages of the company – early-stage prototyping and rapid experimentation, before product/market fit, after product/market fit, and scaling.
§ Familiarity with Agile development, Scrums, continuous integration, and test driven development
processes
§ Data driven product development approach – strong focus on data backed engineering decisions.
§ Ability to develop products incrementally in fast iterations
§ Ability to do collaborative problem solving and design/build chaos resilient systems
§ Strong focus on software quality
§ Ability to mentor junior developers, and help build an excellent engineering team
§ Ability to be a talent magnet – attract great talent to join the core team.
We are looking for warriors who have the hunger to impact real lives and who can match our high
bar on Core Values that we live by.
If you-
§ can apply first principles thinking to solve problems
§ can envision a great future that you want to create
§ have the fire in your belly to get out of your cube and do something about your vision and passion
§ want to work with some really smart people, and still raise the bar for all of us
§ can have fun and help your colleagues have fun doing all of the above...
Job Description:
- 3-4 years of hands-on Python programming and libraries from the PyData stack, such as pandas (a short pandas sketch follows this list)
- Exposure to MongoDB
- Experience in writing unit test cases
- Expertise in writing medium/advanced SQL database queries
- Strong Verbal/Written communication skills
- Ability to work with onsite counterpart teams
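To illustrate the kind of Python/pandas work listed above, here is a short sketch; the CSV file, columns, and business logic are illustrative assumptions.

```python
# Minimal sketch of a pandas aggregation; file name and columns are hypothetical.
import pandas as pd

orders = pd.read_csv("orders.csv", parse_dates=["order_date"])

# Medium-complexity aggregation: monthly revenue per city, completed orders only.
monthly_revenue = (
    orders[orders["status"] == "COMPLETED"]
    .assign(month=lambda df: df["order_date"].dt.to_period("M"))
    .groupby(["city", "month"], as_index=False)["amount"]
    .sum()
    .rename(columns={"amount": "revenue"})
)
print(monthly_revenue.head())
```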
Primary Location: Pune
Description:
Responsibilities -
● The candidate is expected to lead one of the key business areas end to end. This is a purely hands-on role, but he/she may need to mentor junior members of the team.
● Gather requirements with the business and get them prioritized in the sprint cycle.
● Produce the project architecture design and get it approved by the Tech Review committee.
● Ensure quality and timely delivery.
Technical and Professional Requirements-
Required Tech Skills
● Very strong fundamentals of OOP programming
● Very strong Java fundamentals, multithreading, and streams
● Good understanding of data structures
● Good knowledge of any distributed caching/computing framework or tools
● Good at SQL queries and query optimization
Nice To Have (willing to learn)
● AWS Lambda (serverless), Redis, Kinesis, Big Data, Sparx, Spring Boot, NoSQL databases, React.js, JMS/SQS, AWS Cloud, Node.js, Python
● Well versed in the latest server-side technology stack
● Good to have business knowledge of loan management.
We are seeking a Lead Python Developer to lead the backend efforts and, in the process, design, develop, and deploy customer-centric applications.
The person will have the opportunity to design and build an early-stage, rapidly evolving platform from scratch and carry out these primary responsibilities -
Optimize components for maximum performance across multiple devices and browsers
Write performant REST APIs for both internal and external consumption
Build microservices and their deployment processes (a minimal REST microservice sketch follows this list)
Work with problems of scale, leveraging technologies that are distributed in nature.
Perform code reviews
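As a minimal sketch of the REST/microservice responsibilities above, here is a small Flask service; the resource name and in-memory store are illustrative stand-ins for a real database-backed API.

```python
# Minimal sketch of a small REST microservice in Flask; the resource and
# in-memory store are illustrative assumptions, not the actual product API.
from flask import Flask, jsonify, request

app = Flask(__name__)

# Placeholder in-memory store standing in for a real database.
CUSTOMERS = {1: {"id": 1, "name": "Asha", "city": "Pune"}}


@app.route("/api/customers/<int:customer_id>", methods=["GET"])
def get_customer(customer_id):
    customer = CUSTOMERS.get(customer_id)
    if customer is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(customer)


@app.route("/api/customers", methods=["POST"])
def create_customer():
    payload = request.get_json(force=True)
    new_id = max(CUSTOMERS) + 1
    CUSTOMERS[new_id] = {"id": new_id, **payload}
    return jsonify(CUSTOMERS[new_id]), 201


if __name__ == "__main__":
    app.run(port=8000)
```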
Required qualifications and must have skills
Excellent analytical and problem-solving skills
Proven deep expertise in Python programming (4+ years of hands-on experience in Python and backend development)
Building performant and scalable applications from scratch
Experience in working with frameworks like Django, Flask, etc.
Experience with building APIs and services using REST, SOAP, etc.
Experience with any RDBMS and strong SQL knowledge
Comfortable with Unix / Linux command line
Object-oriented concepts & design patterns
System and database design skills
Nice to have Skills
Knowledge of other programming languages beyond Python
Familiarity with managing infrastructure on AWS/GCloud
Experience working with or building data analytics pipelines
Familiarity with NoSQL databases
Good understanding of Docker and container platforms like Mesos and Kubernetes
Security-first architecture approach
Application benchmarking and optimization
Interpersonal Attributes
You are driven by the impact your work creates
You can answer the why behind any technological choice you make
You can work independently as well as part of a team
About Us
DataWeave provides Retailers and Brands with “Competitive Intelligence as a Service” that enables them to take
key decisions that impact their revenue. Powered by AI, we provide easily consumable and actionable
competitive intelligence by aggregating and analyzing billions of publicly available data points on the Web to
help businesses develop data-driven strategies and make smarter decisions.
Products@DataWeave
We, the Products team at DataWeave, build data products that provide timely insights that are readily consumable and actionable, at scale. Our underpinnings are: scale, impact, engagement, and visibility. We help businesses take data-driven decisions every day. We also give them insights for long-term strategy. We are focused on creating value for our customers and helping them succeed.
How we work
It's hard to tell what we love more, problems or solutions! Every day, we choose to address some of the hardest data problems that there are. We are in the business of making sense of messy public data on the web. At serious scale!
What do we offer?
● Opportunity to work on some of the most compelling data products that we are building for online
retailers and brands.
● Ability to see the impact of your work and the value you are adding to our customers almost immediately.
● Opportunity to work on a variety of challenging problems and technologies to figure out what really
excites you.
● A culture of openness. Fun work environment. A flat hierarchy. Organization wide visibility. Flexible
working hours.
● Learning opportunities with courses, trainings, and tech conferences. Mentorship from seniors in the
team.
● Last but not the least, competitive salary packages and fast paced growth opportunities.
Role and Responsibilities
● Build a low latency serving layer that powers DataWeave's Dashboards, Reports, and Analytics
functionality
● Build robust RESTful APIs that serve data and insights to DataWeave and other products
● Design user interaction workflows on our products and integrate them with data APIs
● Help stabilize and scale our existing systems. Help design the next generation systems.
● Scale our back end data and analytics pipeline to handle increasingly large amounts of data.
● Work closely with the Head of Products and UX designers to understand the product vision and design
philosophy
● Lead/be a part of all major tech decisions. Bring in best practices. Mentor younger team members and
interns.
● Constantly think scale, think automation. Measure everything. Optimize proactively.
● Be a tech thought leader. Add passion and vibrance to the team. Push the envelope.
Skills and Requirements
● 4-7 years of experience building and scaling APIs and web applications.
● Experience building and managing large scale data/analytics systems.
● Have a strong grasp of CS fundamentals and excellent problem solving abilities. Have a good
understanding of software design principles and architectural best practices.
● Be passionate about writing code and have experience coding in multiple languages, including at least
one scripting language, preferably Python.
● Be able to argue convincingly why feature X of language Y rocks/sucks, or why a certain design decision
is right/wrong, and so on.
● Be a self-starter—someone who thrives in fast paced environments with minimal ‘management’.
● Have experience working with multiple storage and indexing technologies such as MySQL, Redis,
MongoDB, Cassandra, and Elasticsearch.
● Good knowledge (including internals) of messaging systems such as Kafka and RabbitMQ (a minimal consumer sketch follows this list).
● Use the command line like a pro. Be proficient in Git and other essential software development tools.
● Working knowledge of large-scale computational models such as MapReduce and Spark is a bonus.
● Exposure to one or more centralized logging, monitoring, and instrumentation tools, such as Kibana,
Graylog, StatsD, Datadog etc.
● Working knowledge of building websites and apps. Good understanding of integration complexities and
dependencies.
● Working knowledge of Linux server administration as well as the AWS ecosystem is desirable.
● It's a huge bonus if you have some personal projects (including open-source contributions) that you work on during your spare time. Show off some of the projects you have hosted on GitHub.
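To illustrate the Kafka experience mentioned in the requirements above, here is a minimal consumer sketch using the kafka-python client; the topic, consumer group, and broker address are illustrative assumptions.

```python
# Minimal sketch of a Kafka consumer with the kafka-python client;
# topic, group id, and broker address are illustrative assumptions.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "price-updates",                       # hypothetical topic
    bootstrap_servers="localhost:9092",
    group_id="dashboard-serving-layer",    # hypothetical consumer group
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
    enable_auto_commit=True,
)

for message in consumer:
    record = message.value
    # In a real system this would update a cache or index powering the dashboards.
    print(f"partition={message.partition} offset={message.offset} sku={record.get('sku')}")
```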