If you are an outstanding ETL Developer with a passion for technology who is looking to join a great development organization, we would love to hear from you. We offer technology consultancy services to our Fortune 500 customers, with a primary focus on digital technologies. Our customers are looking for top-tier talent and are willing to compensate based on your skill and expertise. In most cases the nature of our engagement is contract. If you are looking for the next big step in your career, we would be glad to partner with you.
Below is the job description for your review.
Extensive hands-on experience in designing and developing ETL packages using SSIS
Extensive experience in performance tuning of SSIS packages
In-depth knowledge of data warehousing concepts and ETL systems, and of relational databases such as SQL Server 2012/2014.
• 6+ years of data science experience.
• Demonstrated experience in leading programs.
• Prior experience in customer data platforms/finance domain is a plus.
• Demonstrated ability in developing and deploying data-driven products.
• Experience working with large datasets and developing scalable algorithms.
• Hands-on experience working with tech, product, and operations teams.
Technical Skills:
• Deep understanding and hands-on experience of machine learning and deep learning algorithms. Good understanding of NLP and LLM concepts, and practical experience in developing NLU and NLG solutions.
• Experience with Keras/TensorFlow/PyTorch deep learning frameworks.
• Proficient in scripting languages (Python/Shell), SQL.
• Good knowledge of Statistics.
• Experience with big data, cloud, and MLOps.
Soft Skills:
• Strong analytical and problem-solving skills.
• Excellent presentation and communication skills.
• Ability to work independently and deal with ambiguity.
Continuous Learning:
• Stay up to date with emerging technologies.
Qualification:
A degree (B.Tech or equivalent) in Computer Science, Statistics, Applied Mathematics, Machine Learning, or a related field.
- Extensive exposure to at least one Business Intelligence platform (preferably QlikView/Qlik Sense); if not Qlik, knowledge of an ETL tool such as Informatica or Talend
- At least one data query language – SQL/Python
- Experience in creating breakthrough visualizations
- Understanding of RDBMS, Data Architecture/Schemas, Data Integrations, Data Models and Data Flows is a must
Your Day-to-Day
- Derive insights and drive major strategic projects to improve business metrics, taking responsibility for cost efficiency and revenue management across the country
- Perform market research and post-mortem analyses of competitor expansion and market-penetration patterns
- Provide in-depth business analysis and data insights for internal stakeholders to help improve the business; derive and launch projects to close the gaps between targeted and projected business metrics
- Responsible for optimizing Carsome’s C2B and B2C customer acquisition and Dealer retention funnel. Work closely with Marketing and Tech teams to create, produce and implement creative digital marketing campaigns and drive CRM initiatives and strategies
- Analyse revenue flows and process large datasets to gather process insights and propose process-improvement ideas for Carsome across SE Asia
- Lead commercial projects & process mapping, from conceptualization to completion, to build or re-engineer business models, tools and processes.
- Experience in analysing unit economics, COGS, and P&L is preferred but not mandatory
- Use Business Intelligence and Data Science tools to answer the appropriate business problems using SQL, Tableau or Python.
- Coordinate with HQ Data Insights Team and manage internal stakeholders across departments to ensure the smooth delivery of strategic projects
- Work across departments/functions (BI, DE, tech, pricing, finance, operations, marketing, CS, CX) on high-impact projects and support business-expansion initiatives
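As a small illustration of answering a business question with SQL from Python, as in the day-to-day above, here is a hypothetical acquisition-funnel conversion query; the table, stage names, and figures are invented for the example.

```python
import sqlite3

# Hypothetical funnel table: each row records that a user reached a stage.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE funnel (user_id INTEGER, stage TEXT)")
conn.executemany(
    "INSERT INTO funnel VALUES (?, ?)",
    [(1, "visit"), (2, "visit"), (3, "visit"), (4, "visit"),
     (1, "inspection"), (2, "inspection"),
     (1, "sale")],
)

# Count distinct users at the top and bottom of the funnel in one pass.
visits, sales = conn.execute("""
    SELECT
        COUNT(DISTINCT CASE WHEN stage = 'visit' THEN user_id END) AS visits,
        COUNT(DISTINCT CASE WHEN stage = 'sale'  THEN user_id END) AS sales
    FROM funnel
""").fetchone()

print(f"conversion: {sales / visits:.0%}")  # conversion: 25%
```

The same query could feed a Tableau dashboard or a weekly report directly.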
Your Know-How
- At least a Bachelor's Degree in Accounting/Finance/Business or the equivalent.
- 3-5 years of experience in strategy/consulting/analytical/project-management roles; experience in e-commerce, start-ups or unicorns (CARS24, OLA, SWIGGY, FLIPKART, OYO) or entrepreneurial experience preferred, plus at least 2 years of experience leading a team
- Top-notch academics from a Tier 1 college (IIM/IIT/NIT)
- Must have SQL/PostgreSQL/Tableau Experience.
- Excellent Market Research, reporting and analytical skills, including carrying out weekly and monthly reporting
- Experience working with a Data/Business Intelligence team
- Analytical mindset with ability to present data in a structured and informative way
- Enjoy a fast-paced environment and can align business objectives with product priorities
- Good to have: financial modelling, developing financial forecasts, and development of a financial/strategic plan or framework
Company Description
UpSolve is a Gen AI and Vision AI startup that helps businesses solve their problems by building custom solutions that drive strategic business decisions. Whether your business is facing time constraints or a lack of resources, UpSolve can help. We build enterprise-grade AI solutions with a focus on increasing ROI.
Role Description
This is a full-time hybrid role for a Business Analyst located in Mumbai.
Please note: This is an onsite role and good communication skills are expected (oral + written)
Responsibilities
1. Understand existing system integrations for the client.
2. Map and identify gaps in existing systems.
3. Ideate, advise on, and implement AI solutions to optimize business processes.
4. Collaborate with multiple teams and stakeholders.
Qualifications
- MBA with focus on Business Analytics or Bachelor's degree in Computer Science or IT
- Minimum 4 Years of Experience
- Strong written, verbal and collaboration skills
- Immediate Joiner (Less than 5 days)
Work Location: Mumbai, Work from Office
About the role:
Hopscotch is looking for a passionate Data Engineer to join our team. You will work closely with other teams such as data analytics, marketing, data science, and individual product teams to specify, validate, prototype, scale, and deploy data pipeline features and data architecture.
Here’s what will be expected out of you:
➢ Ability to work with a fast-paced startup mindset; should be able to manage all aspects of data extraction, transfer, and load activities.
➢ Develop data pipelines that make data available across platforms.
➢ Should be comfortable in executing ETL (Extract, Transform and Load) processes which include data ingestion, data cleaning and curation into a data warehouse, database, or data platform.
➢ Work on various aspects of the AI/ML ecosystem – data modeling, data and ML pipelines.
➢ Work closely with DevOps and senior architects to come up with scalable system and model architectures for enabling real-time and batch services.
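The ETL flow described above (extract, clean and curate, load into a warehouse) can be sketched in plain Python. The record schema and the in-memory sqlite3 "warehouse" below are invented for illustration, not Hopscotch's actual stack.

```python
import sqlite3

def extract():
    """Extract: raw records as they might arrive from a source system."""
    return [
        {"order_id": "1", "amount": " 250.00 ", "country": "in"},
        {"order_id": "2", "amount": "99.50", "country": "IN"},
        {"order_id": "2", "amount": "99.50", "country": "IN"},  # duplicate to be cleaned
    ]

def transform(rows):
    """Transform: trim/cast fields, normalize values, drop duplicate keys."""
    seen, cleaned = set(), []
    for r in rows:
        if r["order_id"] in seen:
            continue
        seen.add(r["order_id"])
        cleaned.append((int(r["order_id"]), float(r["amount"].strip()), r["country"].upper()))
    return cleaned

def load(rows, conn):
    """Load: write curated rows into the warehouse table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INTEGER PRIMARY KEY, amount REAL, country TEXT)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
print(total)  # (2, 349.5)
```

In production the same three stages would typically be separate Airflow tasks writing to Redshift rather than one script writing to sqlite.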
What we want:
➢ 5+ years of experience as a data engineer or data scientist with a focus on data engineering and ETL jobs.
➢ Well versed with the concept of Data warehousing, Data Modelling and/or Data Analysis.
➢ Experience building pipelines and performing ETL with industry-standard best practices on Redshift (2+ years).
➢ Ability to troubleshoot and solve performance issues with data ingestion, data processing & query execution on Redshift.
➢ Good understanding of orchestration tools like Airflow.
➢ Strong Python and SQL coding skills.
➢ Strong experience with distributed systems such as Spark.
➢ Experience with AWS data and ML technologies (AWS Glue, MWAA, Data Pipeline, EMR, Athena, Redshift, Lambda, etc.).
➢ Solid hands-on experience with various data extraction techniques such as CDC or time/batch-based extraction, and the related tools (Debezium, AWS DMS, Kafka Connect, etc.) for near-real-time and batch data extraction.
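As a rough illustration of the time/batch-based extraction mentioned above (the simpler cousin of log-based CDC via tools like Debezium or AWS DMS), a watermark-driven incremental pull might look like this; the in-memory "source table" and column names are hypothetical.

```python
# Hypothetical source table; in practice this would be a query against the source DB.
source_table = [
    {"id": 1, "name": "alice", "updated_at": "2024-01-01T10:00:00"},
    {"id": 2, "name": "bob",   "updated_at": "2024-01-02T09:30:00"},
    {"id": 3, "name": "carol", "updated_at": "2024-01-03T12:15:00"},
]

def extract_incremental(table, watermark):
    """Return rows changed since the last watermark, plus the new watermark.

    ISO-8601 timestamps compare correctly as strings, so no parsing is needed here.
    """
    changed = [row for row in table if row["updated_at"] > watermark]
    new_watermark = max((row["updated_at"] for row in changed), default=watermark)
    return changed, new_watermark

# One batch run: pull everything modified after the stored watermark.
batch, wm = extract_incremental(source_table, "2024-01-01T23:59:59")
print([row["id"] for row in batch])  # [2, 3]
print(wm)                            # 2024-01-03T12:15:00
```

The new watermark would be persisted (e.g., in a control table) so the next batch resumes where this one stopped; log-based CDC avoids the watermark entirely by tailing the database's change log.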
Note: Experience at product-based or e-commerce companies is an added advantage.
About us
SteelEye is the only regulatory compliance technology and data analytics firm that offers transaction reporting, record keeping, trade reconstruction, best execution and data insight in one comprehensive solution. The firm’s scalable secure data storage platform offers encryption at rest and in flight and best-in-class analytics to help financial firms meet regulatory obligations and gain competitive advantage.
The company has a highly experienced management team and a strong board, who have decades of technology and management experience and worked in senior positions at many leading international financial businesses. We are a young company that shares a commitment to learning, being smart, working hard and being honest in all we do and striving to do that better each day. We value all our colleagues equally and everyone should feel able to speak up, propose an idea, point out a mistake and feel safe, happy and be themselves at work.
Being part of a start-up can be as exciting as it is challenging. You will be part of the SteelEye team not just because of your talent but also because of your entrepreneurial flair, which we thrive on at SteelEye. This means we want you to be curious, contribute, ask questions, and share ideas. We encourage you to get involved in helping shape our business.
What you will do
- Deliver plugins for our Python-based ETL pipelines.
- Deliver Python services for provisioning and managing cloud infrastructure.
- Design, develop, unit-test, and support code in production.
- Deal with challenges associated with large volumes of data.
- Manage expectations with internal stakeholders and context switch between multiple deliverables as priorities change.
- Thrive in an environment that uses AWS and Elasticsearch extensively.
- Keep abreast of technology and contribute to the evolution of the product.
- Champion best practices and provide mentorship.
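A plugin for a Python-based ETL pipeline is, at its simplest, a named transformation registered with the pipeline. The sketch below shows one common pattern (a decorator-based registry); the step names and record fields are illustrative and do not reflect SteelEye's actual plugin API.

```python
# Registry mapping step names to transformation callables.
PLUGINS = {}

def plugin(name):
    """Decorator that registers a callable as a named pipeline step."""
    def register(fn):
        PLUGINS[name] = fn
        return fn
    return register

@plugin("lowercase_keys")
def lowercase_keys(record):
    """Normalize field names to lowercase."""
    return {k.lower(): v for k, v in record.items()}

@plugin("drop_nulls")
def drop_nulls(record):
    """Remove fields with missing values."""
    return {k: v for k, v in record.items() if v is not None}

def run_pipeline(record, steps):
    """Apply the named plugin steps to a record, in order."""
    for name in steps:
        record = PLUGINS[name](record)
    return record

result = run_pipeline({"Trade_ID": "T-1", "Venue": None},
                      ["lowercase_keys", "drop_nulls"])
print(result)  # {'trade_id': 'T-1'}
```

The appeal of the pattern is that new steps can be shipped as plugins without touching the pipeline runner itself.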
What we're looking for
- Python 3.
- Python libraries used for data (such as pandas, numpy).
- AWS.
- Elasticsearch.
- Performance tuning.
- Object Oriented Design and Modelling.
- Delivering complex software, ideally in a FinTech setting.
- CI/CD tools.
- Knowledge of design patterns.
- Sharp analytical and problem-solving skills.
- Strong sense of ownership.
- Demonstrable desire to learn and grow.
- Excellent written and oral communication skills.
- Mature collaboration and mentoring abilities.
What will you get?
- This is an individual contributor role. So if you are someone who loves to code, solve complex problems, and build amazing products without worrying about anything else, this is the role for you.
- You will have the chance to learn from the best in the business who have worked across the world and are technology geeks.
- A company that always appreciates ownership and initiative. If you are someone who is full of ideas, this role is for you.
- Handling the survey scripting process using survey software platforms such as Toluna, QuestionPro, and Decipher.
- Mining large & complex data sets using SQL, Hadoop, NoSQL or Spark.
- Delivering complex consumer data analysis using software such as R, Python, and Excel
- Working on basic statistical analysis such as t-tests and correlation
- Performing more complex data analysis through machine learning techniques such as:
- Classification
- Regression
- Clustering
- Text Analysis
- Neural Networks
- Creating interactive dashboards using software such as Tableau or any other tool you are comfortable with
- Working on statistical and mathematical modelling, and application of ML and AI algorithms
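The basic statistical analysis listed above (correlation, t-tests) can be written directly from the textbook formulas; the sample data below is invented for illustration.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def two_sample_t(a, b):
    """Welch's t-statistic for two independent samples."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    return (ma - mb) / math.sqrt(va / na + vb / nb)

xs = [1, 2, 3, 4, 5]
ys = [2, 4, 6, 8, 10]               # perfectly linear in xs
print(round(pearson_r(xs, ys), 3))  # 1.0
```

In practice one would reach for scipy.stats for the p-values, but the formulas above are what those functions compute under the hood.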
What you need to have:
- Bachelor's or Master's degree in a highly quantitative field (CS, machine learning, mathematics, statistics, economics) or equivalent experience.
- An opportunity for someone eager to prove their data analytics skills with one of the biggest FMCG market players.
- Working closely with business stakeholders to define, strategize and execute crucial business problem statements which lie at the core of improvising current and future data-backed product offerings.
- Building and refining underwriting models for extending credit to sellers and API Partners in collaboration with the lending team
- Conceiving, planning and prioritizing data projects and manage timelines
- Building analytical systems and predictive models as a part of the agile ecosystem
- Testing performance of data-driven products participating in sprint-wise feature releases
- Managing a team of data scientists and data engineers to develop, train and test predictive models
- Managing collaboration with internal and external stakeholders
- Building a data-centric culture from within, partnering with every team, learning deeply about the business, and working with highly experienced, sharp, and insanely ambitious colleagues
What you need to have:
- B.Tech/M.Tech/MS/PhD in Data Science, Computer Science, Statistics, or Mathematics & Computation (from IIT, BITS Pilani, or ISI) with a demonstrated skill set in leading an Analytics and Data Science team
- 8+ years working in the Data Science and analytics domain, with 3+ years of experience leading a data science team: understanding which projects to prioritize and how the team's strategy aligns with the organization's mission
- Deep understanding of credit risk landscape; should have built or maintained underwriting models for unsecured lending products
- Should have led a leadership team in a tech startup, preferably a fintech/lending/credit-risk startup.
- We value entrepreneurial spirit: if you have had the experience of starting your own venture, that is an added advantage.
- Strategic thinker with agility and endurance
- Aware of the latest industry trends in Data Science and Analytics with respect to Fintech, Digital Transformations and Credit-lending domain
- Excellent communication skills are key to managing multiple stakeholders such as the leadership team, product teams, and existing and new investors.
- Cloud computing, Python, SQL, ML algorithms, analytics, and a problem-solving mindset
- Knowledge and demonstrated skill-sets in AWS
- Insurance P&C and Specialty domain experience a plus
- Experience in a cloud-based architecture preferred, such as Databricks, Azure Data Lake, Azure Data Factory, etc.
- Strong understanding of ETL fundamentals and solutions. Should be proficient in writing advanced/complex SQL; expertise in performance tuning and optimization of SQL queries is required.
- Strong experience in Python/PySpark and Spark SQL
- Experience in troubleshooting data issues, analyzing end to end data pipelines, and working with various teams in resolving issues and solving complex problems.
- Strong experience developing Spark applications using PySpark and SQL for data extraction, transformation, and aggregation from multiple formats, analyzing and transforming the data to uncover insights and actionable intelligence for internal and external use