
The Sr. Analytics Engineer provides technical expertise in needs identification, data modeling, data movement, transformation mapping (source to target), and automation and testing strategies, translating business needs into technical solutions that adhere to established data guidelines and approaches from a business-unit or project perspective.
Understands and leverages best-fit technologies (e.g., traditional star-schema structures, cloud, Hadoop, NoSQL) and approaches to address business and environmental challenges.
Provides data understanding and coordinates data-related activities with other data management groups such as master data management, data governance, and metadata management.
Actively participates with other consultants in problem-solving and approach development.
Responsibilities:
Provide a consultative approach with business users, asking questions to understand the business need and deriving the data flow, conceptual, logical, and physical data models based on those needs.
Perform data analysis to validate data models and to confirm the ability to meet business needs.
Assist with and support setting the data architecture direction, ensuring data architecture deliverables are developed, ensuring compliance to standards and guidelines, implementing the data architecture, and supporting technical developers at a project or business unit level.
Coordinate and consult with the Data Architect, project manager, client business and technical staff, and project developers on data architecture best practices and other data-related matters at the project or business-unit level.
Work closely with Business Analysts and Solution Architects to design the data model satisfying the business needs and adhering to Enterprise Architecture.
Coordinate with Data Architects, Program Managers and participate in recurring meetings.
Help and mentor team members to understand the data model and subject areas.
Ensure that the team adheres to best practices and guidelines.
Requirements:
- At least 3 years of strong working knowledge of Spark, Java/Scala/PySpark, Kafka, Git, Unix/Linux, and ETL pipeline design.
- Experience with Spark optimization, tuning, and resource allocation (see the sketch after this list).
- Excellent understanding of in-memory distributed computing frameworks such as Spark, including parameter tuning and writing optimized workflow sequences.
- Experience with relational databases (e.g., PostgreSQL, MySQL) and NoSQL or cloud warehouse databases (e.g., Redshift, BigQuery, Cassandra).
- Familiarity with Docker, Kubernetes, Azure Data Lake/Blob Storage, AWS S3, Google Cloud Storage, etc.
- Deep understanding of the various stacks and components of the big-data ecosystem.
- Hands-on experience with Python is a huge plus.
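
A minimal sketch of the kind of Spark resource allocation and tuning the bullets above refer to; the executor counts, memory sizes, and file paths are illustrative assumptions, not values from the posting:

# A minimal PySpark sketch, assuming a generic cluster; the settings and
# paths below are illustrative, not from the posting.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = (
    SparkSession.builder
    .appName("etl-pipeline")
    .config("spark.executor.instances", "8")        # resource allocation
    .config("spark.executor.cores", "4")
    .config("spark.executor.memory", "8g")
    .config("spark.sql.shuffle.partitions", "200")  # tune to data volume
    .config("spark.sql.adaptive.enabled", "true")   # let AQE coalesce partitions
    .getOrCreate()
)

# Broadcasting a small dimension table avoids a shuffle in star-schema joins.
facts = spark.read.parquet("s3://bucket/facts/")  # hypothetical path
dims = spark.read.parquet("s3://bucket/dims/")    # hypothetical path
joined = facts.join(broadcast(dims), "dim_id")
joined.write.mode("overwrite").parquet("s3://bucket/output/")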

Similar jobs
Python. Django Rest Framework experience would be great, but not essential! We prioritise quality over quantity.
Postgres, with InfluxDB, Oracle, MySQL, and Redis experience a plus.
Docker, Kubernetes, Helm, OpenShift and associated tooling.
AWS, especially ECS, Lambda, RDS and DynamoDB. Performance and uptime are super important to us.
The challenge of scaling a global, distributed API to 10,000+ requests per second.
We have SDKs in a bunch of languages, so the more polyglot you are the better.
If you like writing JS and React that would be awesome too.
Minimum 3+ years of core Java programming with the Collections Framework, concurrent programming, and multi-threading (good knowledge of ExecutorService, ForkJoinPool, and other threading concepts; see the sketch after this list).
· Good knowledge of the JVM with an understanding of performance and memory optimization.
· Extensive and expert programming experience in the Java programming language (strong OO skills preferred).
· Excellent knowledge of collections such as ArrayList, Vector, LinkedList, HashMap, Hashtable, and HashSet is mandatory.
· Exercised exemplary development practices including design specification, coding standards, unit testing, and code-reviews.
· Expert level understanding of Object-Oriented Concepts and Data Structures
· Good experience with databases (Sybase, Oracle, or SQL Server), including indexing (clustered, non-clustered), hashing, segmenting, data types such as CLOB/BLOB, (materialized) views, replication, constraints, functions, triggers, stored procedures, etc.
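
The threading concepts above are Java-specific; as a language-neutral illustration, here is a minimal sketch of the same bounded executor pattern using Python's concurrent.futures (the worker function and inputs are illustrative assumptions):

# A minimal sketch of the executor pattern, shown in Python as an analogue
# of Java's ExecutorService; the worker and inputs are illustrative.
from concurrent.futures import ThreadPoolExecutor, as_completed

def fetch_length(word: str) -> int:
    # Stand-in for real work (I/O, computation, etc.).
    return len(word)

words = ["alpha", "beta", "gamma"]
with ThreadPoolExecutor(max_workers=4) as pool:  # bounded worker pool
    futures = {pool.submit(fetch_length, w): w for w in words}
    for fut in as_completed(futures):
        print(futures[fut], fut.result())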
Responsibilities:
· Develop Python-based APIs using FastAPI and Flask frameworks.
· Develop Python-based Automation scripts and Libraries.
· Develop Front End Components using VueJS and ReactJS.
· Write and modify Dockerfiles for the back-end and front-end components.
· Integrate CI/CD pipelines for Automation and Code quality checks.
· Write complex ORM mappings using SQLAlchemy (see the sketch after this list).
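
A minimal sketch of a FastAPI endpoint backed by a SQLAlchemy ORM mapping, to make the responsibilities above concrete; the model, columns, and SQLite URL are illustrative assumptions, not details from the posting:

# A minimal sketch; the Device model and sqlite URL are illustrative.
from fastapi import FastAPI
from sqlalchemy import create_engine, String, select
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column, Session

class Base(DeclarativeBase):
    pass

class Device(Base):
    __tablename__ = "devices"
    id: Mapped[int] = mapped_column(primary_key=True)
    name: Mapped[str] = mapped_column(String(64))

engine = create_engine("sqlite:///demo.db")  # swap for MySQL/PostgreSQL in practice
Base.metadata.create_all(engine)

app = FastAPI()

@app.get("/devices")
def list_devices() -> list[dict]:
    # Query through the ORM and serialize rows for the JSON response.
    with Session(engine) as session:
        rows = session.scalars(select(Device)).all()
        return [{"id": d.id, "name": d.name} for d in rows]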
Required Skills:
· Strong experience in Python development in a full-stack environment, including NodeJS, VueJS/Vuex, Flask, etc., is a requirement.
· Experience with SQLAlchemy or similar ORM frameworks.
· Experience working with Geolocation APIs (e.g., Google Maps, Mapbox).
· Experience using Elasticsearch and Airflow is a plus.
· Strong knowledge of SQL, comfortable working with MySQL and/or PostgreSQL databases.
· Understand concepts of Data Modeling.
· Experience with REST.
· Experience with Git, GitFlow, and code review process.
· Good understanding of basic UI and UX principles.
· Possess excellent problem-solving and communication skills.
Job Description
Title - Lead Snowflake Developer
Location - Chennai/Hyderabad/Bangalore
Role - Fulltime
Notice Period/Availability - Immediate
Years of Experience - 6+
Job Description:
- Overall 6+ years of experience in IT/software development.
- Minimum 3 years of experience working with Snowflake.
- Designing, implementing and testing cloud computing solutions using Snowflake technology.
- Creating, monitoring, and optimizing ETL/ELT processes (see the sketch after this list).
- Migrating solutions from on-premises to public cloud platforms.
- Experience in SQL language and data warehousing concepts.
- Experience in Cloud technologies: AWS, Azure or GCP.
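
A minimal sketch of an ELT step driven through the Snowflake Python connector; the account, credentials, stage, and table names are placeholders, not details from the posting:

# A minimal sketch; all connection values and object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder
    user="etl_user",        # placeholder
    password="***",         # use a secrets manager in practice
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)
cur = conn.cursor()
try:
    # Load staged files, then transform in-warehouse (the "T" of ELT).
    cur.execute("COPY INTO staging.orders FROM @orders_stage FILE_FORMAT = (TYPE = CSV)")
    cur.execute(
        "INSERT INTO core.daily_orders "
        "SELECT order_date, COUNT(*) FROM staging.orders GROUP BY order_date"
    )
finally:
    cur.close()
    conn.close()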
Quantsapp is India's first option-trading analytics platform on mobile. With an ever-growing user base, it is one of the fastest-growing platforms for options trading in India. Quantsapp wants to accelerate its growth even further and expand into new countries, which requires the development team to grow.
At Quantsapp we are looking for a dynamic teammate to take up a server-side development role to support the brain behind the application.
Job Summary:
- You will be responsible for developing new logic, products, and features as described by the business and research teams.
- An ideal candidate is strong in mathematical processes such as optimization, matrix algebra, differential equations, and simulation, and has decent hands-on experience with Python, SQL Server, and preferably AWS. An IIT degree is a plus.
Responsibilities:
- Create algorithms from scratch
- Create products and backend APIs as described by the business team
- Back-test and create hypotheses as desired by the research team (see the sketch after this list)
- Code the backend logic for consumption by the UI team
- Deploy WebSockets, REST APIs, and dynamic TCP/UDP-based data flows
- Deploy and maintain code under version control
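
A minimal sketch of the back-testing work described above, done with the Pandas/NumPy stack the requirements call for; the synthetic prices and moving-average rule are illustrative assumptions, not Quantsapp logic:

# A minimal vectorized back-test sketch on synthetic prices; the rule and
# window sizes are illustrative only.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
prices = pd.Series(100 * np.exp(np.cumsum(rng.normal(0, 0.01, 500))))

fast = prices.rolling(10).mean()
slow = prices.rolling(50).mean()
position = (fast > slow).astype(int).shift(1).fillna(0)  # trade on next bar

returns = prices.pct_change().fillna(0)
equity = (1 + position * returns).cumprod()  # hypothesis: trend-following works
print(f"final equity multiple: {equity.iloc[-1]:.3f}")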
Requirements:
- Should possess good knowledge of advanced computing and mathematical processes
- Strong hands-on experience with Python and, optionally, MATLAB
- Knowledge of SQL and NoSQL databases
- Ability to work with tight timelines
- In-depth knowledge of and hands-on experience with Pandas and NumPy
- Knowledge of Option Markets is a plus
- Excellent organizational and multitasking ability
- Experience on AWS Cloud is a plus
Best Opportunity for Experienced Professionals
Role/Position - SQL DB Dev Analyst
Job Description
Proven work experience of 8+ years as a database developer.
In depth understanding of data management (e.g. permissions, recovery, security and monitoring)
Hands on experience with MS SQL Server.
Experience developing and maintaining T-SQL stored procedures for applications, reports, and agent jobs (see the sketch after this list).
Experience working with the DB administrator on overall database performance tuning, including applying indexes and refining stored procedures.
Experience converting legacy DB applications.
Design, code, test and debug business logic for software applications using T-SQL per defined technical specifications.
Excellent analytical and organization skills.
Knowledge of software development and user interface web applications.
Ability to understand front-end user requirements, with a problem-solving attitude.
Should have excellent analytical and communication skills.
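
A minimal sketch of exercising a T-SQL stored procedure from Python via pyodbc, as a test harness for the procedures mentioned above; the connection string, procedure name, and parameter are illustrative assumptions:

# A minimal sketch; driver, server, database, procedure, and parameter
# below are illustrative assumptions.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=localhost;DATABASE=AppDb;Trusted_Connection=yes;"
)
cursor = conn.cursor()
# ODBC call-escape syntax invokes a stored procedure with one parameter.
cursor.execute("{CALL dbo.usp_GetOrdersByStatus (?)}", "OPEN")
for row in cursor.fetchall():
    print(row)
cursor.close()
conn.close()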
Qualifications
· B.E./B.Tech./M.Sc. in Computer Science or IT.
Additional Information
Responsibilities:
· Maximize system efficiency by regularly tuning and optimizing stored procedures.
· Perform data analysis to clean up data, creating queries and automation routines.
· Translate use cases into functional applications
· Ensure the best possible performance, quality and responsiveness of applications.
· Help maintain code quality
· Able to work well in a team setting.
Notice Period - 15 days
Salary Package - Open
Day-to-Day Responsibilities:
1. Develop Django-based RESTful APIs and WebSockets (see the sketch after this list)
2. Manage and mentor the junior and intern developers to deliver the project requirements in the given timeline
3. Carry out database modelling and design
4. Make DFDs, ER diagrams, etc.
5. Integrate Python APIs/endpoints with other Python outputs (JSON, etc.)
6. Follow clean-code practices and structure the code for collaborative development
7. Analyze different use cases and come up with creative solutions
8. Build reusable components and back-end libraries for future use
9. Participate in daily scrums
10. Work on responsive web development
11. Work with the team to manage, optimize, and customize multiple web applications
12. Learn and work using new technologies
13. Work on being involved and participate in the overall application lifecycle
14. Work with a focus on coding and debugging
15. Collaborate with front-end developers
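
A minimal sketch of the Django REST Framework side of item 1; the model, serializer, and route names are illustrative assumptions (WebSockets would typically be added via Django Channels, not shown here):

# A minimal sketch; serializer, viewset, and route names are illustrative.
from django.contrib.auth.models import User
from rest_framework import routers, serializers, viewsets

class UserSerializer(serializers.ModelSerializer):
    class Meta:
        model = User
        fields = ["id", "username", "email"]

class UserViewSet(viewsets.ReadOnlyModelViewSet):
    # Read-only endpoints: GET /users/ and GET /users/<pk>/.
    queryset = User.objects.all()
    serializer_class = UserSerializer

router = routers.DefaultRouter()
router.register(r"users", UserViewSet)
urlpatterns = router.urls  # include these in the project's urls.py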
