50+ Python Jobs in Mumbai | Python Job openings in Mumbai
Job Title - Cloud Fullstack Engineer
Experience Required - 5 Years
Location - Mumbai
Immediate joiners are preferred.
About the Job
As a Cloud Fullstack Engineer, you will design, develop, and maintain end-to-end solutions for cloud-based applications. You will be responsible for building both frontend and backend components, integrating them seamlessly, and ensuring they work efficiently within a cloud infrastructure.
What You’ll Be Doing
- Frontend Development
- Design and implement user-friendly and responsive web interfaces using modern frontend technologies (e.g., React, Angular, Vue.js).
- Ensure cross-browser compatibility and mobile responsiveness of web applications.
- Collaborate with UX/UI designers to translate design specifications into functional and visually appealing interfaces.
- Backend Development
- Develop scalable and high-performance backend services and APIs to support frontend functionalities.
- Design and manage cloud-based databases and data storage solutions.
- Implement authentication, authorization, and security best practices in backend services.
- Cloud Integration
- Build and deploy cloud-native applications using platforms such as AWS, Google Cloud Platform (GCP), or Azure.
- Leverage cloud services for computing, storage, and networking to enhance application performance and scalability.
- Implement and manage CI/CD pipelines for seamless deployment of applications and updates.
- End-to-End Solution Development
- Architect and develop fullstack applications that integrate frontend and backend components efficiently.
- Ensure data flow between frontend and backend is seamless and secure.
- Troubleshoot and resolve issues across the stack, from UI bugs to backend performance problems.
- Performance Optimization
- Monitor and optimize application performance, including frontend load times and backend response times.
- Implement caching strategies, load balancing, and other performance-enhancing techniques (a minimal caching sketch follows this list).
- Conduct performance testing and address bottlenecks and scalability issues.
- Security and Compliance
- Implement security best practices for both frontend and backend components to protect against vulnerabilities.
- Ensure compliance with relevant data protection regulations and industry standards.
- Conduct regular security assessments and audits to maintain application integrity.
- Collaboration and Communication
- Work closely with cross-functional teams, including product managers, designers, and other engineers, to deliver high-quality solutions.
- Participate in code reviews, technical discussions, and project planning sessions.
- Document code, processes, and architecture to facilitate knowledge sharing and maintainability.
- Continuous Improvement
- Stay updated with the latest trends and advancements in frontend and backend development, as well as cloud technologies.
- Contribute to the development of best practices and standards for fullstack development within the team.
- Participate in knowledge-sharing sessions and provide mentorship to junior engineers.
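To illustrate one common caching strategy, here is a minimal in-process caching sketch using Python's standard library; the function name and the simulated slow lookup are hypothetical stand-ins for a real database or API call.

```python
# Minimal illustration of an in-process caching strategy using the standard library.
from functools import lru_cache
import time

@lru_cache(maxsize=1024)  # cache up to 1024 distinct user_id lookups
def get_user_profile(user_id: int) -> dict:
    # Stand-in for a slow database or API call.
    time.sleep(0.2)
    return {"id": user_id, "name": f"user-{user_id}"}

start = time.perf_counter()
get_user_profile(42)          # slow: hits the "database"
get_user_profile(42)          # fast: served from the cache
print(f"two calls took {time.perf_counter() - start:.2f}s")
print(get_user_profile.cache_info())  # hits=1, misses=1
```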
What We Need To See
- Strong experience in both frontend and backend development, as well as expertise in cloud technologies and services.
- Experience in fullstack development, with a strong focus on both frontend and backend technologies.
- Proven experience with cloud platforms (AWS, GCP, Azure) and cloud-native application development.
- Experience with modern frontend frameworks (e.g., React, Angular, Vue.js) and backend technologies (e.g., Node.js, Java, Python).
- Technical Expertise:
1. Frontend
- Hands-on experience with HTML5, CSS, JavaScript, React.js, Next.js, Redux, jQuery
2. Proficiency in Backend Development
- Strong experience with backend programming languages such as Node.js, Python
- Expertise in working with frameworks such as NestJS, Express.js, or Django.
3. Microservices Architecture
- Experience designing and implementing microservices architectures.
- Knowledge of service discovery, API gateways, and distributed tracing.
4. API Development
- Proficiency in designing, building, and maintaining RESTful and GraphQL APIs.
- Experience with API security, rate limiting, and authentication mechanisms (e.g., JWT, OAuth); a minimal JWT sketch follows this section.
5. Database Management
- Strong knowledge of relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g. MongoDB).
- Experience in database schema design, optimization, and management.
6. Cloud Services
- Hands-on experience with cloud platforms such as Azure, AWS, or Google Cloud.
- Security: Knowledge of security best practices and experience implementing secure coding practices.
- Soft Skills:
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills, with the ability to work effectively in a team environment.
- Ability to manage multiple priorities and work in a fast-paced, dynamic environment.
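As a sketch of token-based API authentication, the snippet below issues and verifies a JWT with the PyJWT library; the secret, claims, and expiry shown are illustrative placeholders rather than a prescribed implementation.

```python
# Minimal JWT issue/verify sketch using PyJWT (pip install PyJWT).
import datetime
import jwt

SECRET = "change-me"  # in practice, load from a secrets manager, not source code

def issue_token(user_id: str) -> str:
    payload = {
        "sub": user_id,
        "exp": datetime.datetime.utcnow() + datetime.timedelta(hours=1),
    }
    return jwt.encode(payload, SECRET, algorithm="HS256")

def verify_token(token: str) -> dict:
    # Raises jwt.ExpiredSignatureError / jwt.InvalidTokenError on failure.
    return jwt.decode(token, SECRET, algorithms=["HS256"])

token = issue_token("user-123")
print(verify_token(token)["sub"])  # -> user-123
```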
Building the machine learning production system (or MLOps) is the biggest challenge most large companies currently face in making the transition to becoming an AI-driven organization. This position is an opportunity for an experienced, server-side developer to build expertise in this exciting new frontier. You will be part of a team deploying state-of-the-art AI solutions for Fractal clients.
Responsibilities
As MLOps Engineer, you will work collaboratively with Data Scientists and Data engineers to deploy and operate advanced analytics machine learning models. You’ll help automate and streamline Model development and Model operations. You’ll build and maintain tools for deployment, monitoring, and operations. You’ll also troubleshoot and resolve issues in development, testing, and production environments.
- Enable model tracking, model experimentation, and model automation
- Develop ML pipelines to support model development and operations
- Develop MLOps components in the machine learning development life cycle using a model repository (either of): MLflow, Kubeflow Model Registry (a minimal MLflow tracking sketch follows this list)
- Develop MLOps components in the machine learning development life cycle using machine learning services (either of): Kubeflow, DataRobot, Hopsworks, Dataiku, or any relevant ML E2E PaaS/SaaS
- Work across all phases of the model development life cycle to build MLOps components
- Build the knowledge base required to deliver increasingly complex MLOps projects on Azure
- Be an integral part of client business development and delivery engagements across multiple domains
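For illustration, here is a minimal MLflow experiment-tracking sketch on a public scikit-learn dataset; the experiment name, parameters, and model are placeholders, not an actual client workflow.

```python
# Illustrative MLflow experiment-tracking sketch (pip install mlflow scikit-learn).
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("demo-classifier")
with mlflow.start_run():
    params = {"C": 1.0, "max_iter": 200}
    model = LogisticRegression(**params).fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))

    mlflow.log_params(params)                 # track hyperparameters
    mlflow.log_metric("accuracy", acc)        # track evaluation metrics
    mlflow.sklearn.log_model(model, "model")  # version the model artifact
```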
Required Qualifications
- 3-5 years' experience building production-quality software.
- B.E/B.Tech/M.Tech in Computer Science or a related technical degree, or equivalent
- Strong experience in System Integration, Application Development, or Data Warehouse projects across technologies used in the enterprise space
- Knowledge of MLOps, machine learning, and Docker
- Object-oriented languages (e.g. Python, PySpark, Java, C#, C++)
- CI/CD experience (e.g., Jenkins, GitHub Actions, etc.)
- Database programming using any flavors of SQL
- Knowledge of Git for Source code management
- Ability to collaborate effectively with highly technical resources in a fast-paced environment
- Ability to solve complex challenges/problems and rapidly deliver innovative solutions
- Foundational Knowledge of Cloud Computing on Azure
- Hunger and passion for learning new skills
Building the machine learning production system (or MLOps) is the biggest challenge most large companies currently face in making the transition to becoming an AI-driven organization. This position is an opportunity for an experienced, server-side developer to build expertise in this exciting new frontier. You will be part of a team deploying state-of-the-art AI solutions for Fractal clients.
Responsibilities
As MLOps Engineer, you will work collaboratively with Data Scientists and Data engineers to deploy and operate advanced analytics machine learning models. You’ll help automate and streamline Model development and Model operations. You’ll build and maintain tools for deployment, monitoring, and operations. You’ll also troubleshoot and resolve issues in development, testing, and production environments.
- Enable model tracking, model experimentation, and model automation
- Develop scalable ML pipelines (a minimal pipeline sketch follows this list)
- Develop MLOps components in the machine learning development life cycle using a model repository (either of): MLflow, Kubeflow Model Registry
- Develop MLOps components using machine learning services (either of): Kubeflow, DataRobot, Hopsworks, Dataiku, or any relevant ML E2E PaaS/SaaS
- Work across all phases of the model development life cycle to build MLOps components
- Build the knowledge base required to deliver increasingly complex MLOps projects on Azure
- Be an integral part of client business development and delivery engagements across multiple domains
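A minimal scikit-learn pipeline sketch, shown only to illustrate packaging preprocessing and a model as one versionable, deployable unit; the dataset and steps are placeholders.

```python
# Minimal sketch of a reusable ML pipeline with scikit-learn.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

pipeline = Pipeline([
    ("scale", StandardScaler()),                 # preprocessing step
    ("clf", LogisticRegression(max_iter=1000)),  # model step
])
pipeline.fit(X_train, y_train)
print("test accuracy:", pipeline.score(X_test, y_test))
```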
Required Qualifications
- 5.5-9 years' experience building production-quality software
- B.E/B.Tech/M.Tech in Computer Science or a related technical degree, or equivalent
- Strong experience in System Integration, Application Development, or Data Warehouse projects across technologies used in the enterprise space
- Expertise in MLOps, machine learning, and Docker
- Object-oriented languages (e.g. Python, PySpark, Java, C#, C++)
- Experience developing CI/CD components for production-ready ML pipelines.
- Database programming using any flavors of SQL
- Knowledge of Git for Source code management
- Ability to collaborate effectively with highly technical resources in a fast-paced environment
- Ability to solve complex challenges/problems and rapidly deliver innovative solutions
- Team leadership, problem-solving, project management, and communication skills, plus creative thinking
- Foundational Knowledge of Cloud Computing on Azure
- Hunger and passion for learning new skills
Responsibilities
- Design and implement advanced solutions utilizing Large Language Models (LLMs).
- Demonstrate self-driven initiative by taking ownership and creating end-to-end solutions.
- Conduct research and stay informed about the latest developments in generative AI and LLMs.
- Develop and maintain code libraries, tools, and frameworks to support generative AI development.
- Participate in code reviews and contribute to maintaining high code quality standards.
- Engage in the entire software development lifecycle, from design and testing to deployment and maintenance.
- Collaborate closely with cross-functional teams to align messaging, contribute to roadmaps, and integrate software into different repositories for core system compatibility.
- Possess strong analytical and problem-solving skills.
- Demonstrate excellent communication skills and the ability to work effectively in a team environment.
Primary Skills
- Generative AI: Proficiency with SaaS LLMs, including LangChain, LlamaIndex, vector databases, and prompt engineering (CoT, ToT, ReAct, agents). Experience with Azure OpenAI, Google Vertex AI, AWS Bedrock for text/audio/image/video modalities (a toy retrieval sketch follows this list).
- Familiarity with open-source LLMs, including tools like TensorFlow/PyTorch and Hugging Face. Techniques such as quantization, LLM fine-tuning using PEFT, RLHF, data annotation workflows, and GPU utilization.
- Cloud: Hands-on experience with cloud platforms such as Azure, AWS, and GCP. Cloud certification is preferred.
- Application Development: Proficiency in Python, Docker, FastAPI/Django/Flask, and Git.
- Natural Language Processing (NLP): Hands-on experience in use case classification, topic modeling, Q&A and chatbots, search, Document AI, summarization, and content generation.
- Computer Vision and Audio: Hands-on experience in image classification, object detection, segmentation, image generation, audio, and video analysis.
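As a toy illustration of the retrieval step behind vector-database-backed (RAG-style) LLM applications, the sketch below ranks documents by cosine similarity; the embeddings are fabricated numbers, and a real system would use an embedding model and a vector store.

```python
# Toy RAG retrieval step: pick the most similar document by cosine similarity.
import numpy as np

documents = {
    "policy.txt": np.array([0.9, 0.1, 0.0]),
    "pricing.txt": np.array([0.1, 0.8, 0.1]),
    "faq.txt": np.array([0.2, 0.2, 0.9]),
}

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

query_vec = np.array([0.85, 0.15, 0.05])  # pretend embedding of the user question
best_doc = max(documents, key=lambda name: cosine(query_vec, documents[name]))
print("retrieved context:", best_doc)  # this text would be inserted into the LLM prompt
```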
Company: CorpCare
Title: Lead Engineer (Full stack developer)
Location: Mumbai (work from office)
CTC: Commensurate with experience
About Us:
CorpCare is India’s first all-in-one corporate funds and assets management platform. We offer a single-window solution for corporates, family offices, and HNIs. We assist corporates in formulating and managing treasury management policies and conducting reviews with investment committees and the board.
Job Summary:
The Lead Engineer will be responsible for overseeing the development, implementation, and management of our corporate funds and assets management platform. This role demands a deep understanding of the broking industry/Financial services industry, software engineering, and product management. The ideal candidate will have a robust background in engineering leadership, a proven track record of delivering scalable technology solutions, and strong product knowledge.
Key Responsibilities:
- Engineering Strategy and Vision:
- Develop and communicate a clear engineering vision and strategy aligned with our broking and funds management platform.
- Conduct market research and technical analysis to identify trends, opportunities, and customer needs within the broking industry.
- Define and prioritize the engineering roadmap, ensuring alignment with business goals and customer requirements.
- Lead cross-functional engineering teams (software development, QA, DevOps, etc.) to deliver high-quality products on time and within budget.
- Oversee the entire software development lifecycle, from planning and architecture to development and deployment, ensuring robust and scalable solutions.
- Write detailed technical specifications and guide the engineering teams to ensure clarity and successful execution.
- Leverage your understanding of the broking industry to guide product development and engineering efforts.
- Collaborate with product managers to incorporate industry-specific requirements and ensure the platform meets the needs of brokers, traders, and financial institutions.
- Stay updated with regulatory changes, market trends, and technological advancements within the broking sector.
- Mentor and lead a high-performing engineering team, fostering a culture of innovation, collaboration, and continuous improvement.
- Recruit, train, and retain top engineering talent to build a world-class development team.
- Conduct regular performance reviews and provide constructive feedback to team members.
- Define and track key performance indicators (KPIs) for engineering projects to ensure successful delivery and performance.
- Analyze system performance, user data, and platform metrics to identify areas for improvement and optimization.
- Prepare and present engineering performance reports to senior management and stakeholders.
- Work closely with product managers, sales, marketing, and customer support teams to align engineering efforts with overall business objectives.
- Provide technical guidance and support to sales teams to help them understand the platform's capabilities and competitive advantages.
- Engage with customers, partners, and stakeholders to gather feedback, understand their needs, and validate engineering solutions.
Requirements:
- BE/B.Tech in Computer Science from a top engineering college
- MBA a plus, not required
- 5+ years of experience in software engineering, with at least 2+ years in a leadership role.
- Strong understanding of the broking industry and financial services industry.
- Proven track record of successfully managing and delivering complex software products.
- Excellent communication, presentation, and interpersonal skills.
- Strong analytical and problem-solving abilities.
- Experience with Agile/Scrum methodologies.
- Deep understanding of software architecture, cloud computing, and modern development practices.
Technical Expertise:
- Front-End: React, Next.js, JavaScript, HTML5, CSS3
- Back-End: Node.js, Express.js, RESTful APIs
- Database: MySQL, PostgreSQL, MongoDB
- DevOps: Docker, Kubernetes, AWS (EC2, S3, RDS), CI/CD pipelines
- Version Control: Git, GitHub/GitLab
- Other: TypeScript, Webpack, Babel, ESLint, Redux
Preferred Qualifications:
- Experience in the broking or financial services industry.
- Familiarity with data analytics tools and methodologies.
- Knowledge of user experience (UX) design principles.
- Experience with trading platforms or financial technology products.
This role is ideal for someone who combines strong technical expertise with a deep understanding of the broking industry and a passion for delivering high-impact software solutions.
Job Title: Data Analyst Associate – Data Management
Location: Mumbai, India
Company: Wissen Technology
About Us:
Wissen Technology is a leading technology and consulting firm known for its strong commitment to excellence, innovation, and customer satisfaction. We are passionate about harnessing data to drive smart business decisions and deliver impactful insights. Join us in shaping the future of data management.
Role Overview:
We are looking for a meticulous and proactive Data Analyst Associate - Data Management to join our team. This role requires a keen attention to detail, solid organizational skills, and a foundational knowledge of the asset management industry. You will support data analysis, management, and reporting initiatives, focusing on ensuring accuracy, efficiency, and improved decision-making processes.
Key Responsibilities:
- Data Management: Support data integrity and accuracy across various data systems; handle data extraction, transformation, and loading (ETL) processes (a minimal ETL sketch follows this list).
- Analysis & Reporting: Generate insightful reports and conduct data analysis, utilizing Microsoft Excel and Business Objects; provide data support for decision-making processes.
- Collaboration: Engage with global business areas, provide data-related support, and actively participate in relevant meetings.
- Documentation & Communication: Document data management procedures, contribute to data governance practices, and effectively communicate findings with team members and stakeholders.
- Process Improvement: Identify and implement improvements in data operations to enhance efficiency and effectiveness.
- Technical Proficiency: Utilize data management tools and develop insights using BI reporting tools, including Tableau or Power BI; familiarity with Python is a plus.
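A minimal extract-transform-load (ETL) sketch with pandas and SQLite, purely illustrative; the columns, aggregation, and target table are assumptions, not the firm's actual data model.

```python
# Minimal ETL sketch with pandas; inline data stands in for a real source file.
import pandas as pd
import sqlite3

# Extract: in practice this would be pd.read_csv(...) or a database query.
raw = pd.DataFrame({
    "fund": ["A", "A", "B"],
    "asset": ["EQ", "FI", "EQ"],
    "market_value": ["100.5", "200.0", "150.25"],
})

# Transform: enforce types and aggregate by fund.
raw["market_value"] = pd.to_numeric(raw["market_value"])
summary = raw.groupby("fund", as_index=False)["market_value"].sum()

# Load: write the curated table to a reporting database.
with sqlite3.connect("reporting.db") as conn:
    summary.to_sql("fund_summary", conn, if_exists="replace", index=False)
print(summary)
```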
Qualifications:
- Education: BTech / BE / MCA or any related field
- Industry Knowledge: Basic understanding of the asset management industry and data management practices.
- Technical Skills: Proficiency in Microsoft Excel, with foundational knowledge of Microsoft Access and Business Objects. Knowledge of BI reporting tools (Tableau, Power BI) is preferred, and Python skills are a plus.
- Communication Skills: Strong written and verbal communication abilities, with a focus on customer service and responsiveness.
- Organizational Skills: Strong multitasking, prioritization, and time management skills; attention to detail is essential.
- Project Management: Ability to coordinate and manage project tasks effectively, meeting deadlines with minimal supervision.
Why Join Wissen Technology?
- Opportunity to be part of a growing team focused on data-driven innovation and quality.
- Exposure to global clients and complex data management projects.
- Competitive benefits, including health coverage, paid time off, and a collaborative work environment.
We look forward to welcoming a detail-oriented and driven Data Analyst Associate to our team!
WHO WE ARE:
TIFIN is a fintech company backed by industry leaders including JP Morgan, Morningstar, Broadridge and Hamilton Lane.
We build engaging experiences through powerful AI and personalization. We leverage the combined power of investment intelligence, data science, and technology to make investing a more engaging experience and a more powerful driver of financial wellbeing.
At TIFIN, design and behavioral thinking enables engaging customer centered experiences along with software and application programming interfaces (APIs). We use investment science and intelligence to build algorithmic engines inside the software and APIs to enable better investor outcomes.
We hope to change the world of wealth in the way personalized delivery has changed the world of movies, music, and more. In a world where every individual is unique, we believe the power of AI-based personalization to match individuals to financial advice and investments is necessary to drive wealth goals.
OUR VALUES:
- Shared Understanding through Listening and Speaking the Truth. We communicate with radical candor, precision, and compassion to create a shared understanding. We challenge, but once a decision is made, commit fully. We listen attentively, speak candidly.
- Teamwork for Team Win. We believe in winning together and learning together. We fly in formation. We cover each other's backs. We inspire each other with our energy and attitude.
- Make Magic for our Users. We center around the voice of the customer. With deep empathy for our clients, we create technology that transforms investor experiences.
- Grow at the Edge. We are driven by personal growth. We get out of our comfort zone and keep egos aside to find our genius zones. We strive to be the best we can possibly be. No excuses.
- Innovate with Creative Solutions. We believe that disruptive innovation begins with curiosity and creativity. We challenge the status quo and problem solve to find new answers.
WHAT YOU'LL BE DOING:
As part of TIFIN’s technology division, you will take on an Automation QA role where dedication to quality and attention to detail is the single most important aspect of the job. This position requires a dynamic individual who can not only manage a team of skilled automation engineers but also work as an individual contributor within the same team.
- Deliver all releases on time while ensuring high levels of automation coverage
- Work with QA Team to understand the product requirements and be able to provide automation testing estimates.
- Utilise experience (technical, functional) to create automation test strategies and execution plans to be able to deliver releases as per business needs.
- Create and execute automation test plans to allow for in-sprint automation.
- Review automation test artifacts created by team members.
- Work with development in an agile delivery methodology to capture defects early in the lifecycle through regular execution of the regression automation suite.
- Work with automation engineers to get the best ROI by identifying automation candidates.
Skills required
- 5+ years of experience in automation testing
- Must have 1+ years of experience working with test frameworks including Selenium and/or Cypress (a minimal Selenium sketch appears after the lists below).
- Hands-on expertise in various automation frameworks (BDD, data-driven, keyword-driven, hybrid).
- Must be capable of understanding test requirements, planning, and creating test suites.
- Strong command of a programming/scripting language (Java/Python/JavaScript)
- Experience in services/API testing and automation (JMeter or any similar tool)
- Good exposure to SQL and Unix
- Proven experience working within an Agile framework
- Strong analytical and troubleshooting skills
- Solid experience with test management and version control tools (Jira, Git, Bitbucket, or any other)
- Excellent communication (verbal and written) skills.
WHO ARE YOU:
- Experience with Mobile automation using Appium or similar tools.
- Exposure to Continuous Integration & Continuous Deployment.
- Strong team and task management skills
- Exposure to performance and security testing
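A minimal Selenium WebDriver sketch in Python, assuming a local Chrome setup; the target URL and assertion are placeholders, and a real suite would wrap this in pytest with explicit waits.

```python
# Minimal Selenium WebDriver sketch (pip install selenium); assumes local Chrome.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    driver.get("https://example.com")
    heading = driver.find_element(By.TAG_NAME, "h1")
    assert "Example" in heading.text  # a real test would use explicit waits
finally:
    driver.quit()
```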
Job Location : Mumbai
COMPENSATION AND BENEFITS PACKAGE:
Competitive and commensurate to experience + discretionary annual bonus + ESOPs
About the Tifin Group: The Tifin group combines expertise in finance, technology, entrepreneurship and investing to start and help build a portfolio of brands and companies in areas of investments, wealth management and asset management.
TIFIN companies are centred around the user and emphasize design innovation to build operating systems. We focus on simplifying and democratizing financial science to make it more holistic and integral to users’ lives.
Artificial Intelligence Researcher (Computer Vision)
Responsibilities
• Work on various SOTA computer vision models, dataset augmentation & dataset generation techniques that help improve model accuracy & precision.
• Work on development & improvement of End-to-End Pipeline use cases running at scale.
• Programming skills with multi-threaded GPU CUDA computing and API Solutions.
• Proficient with training of detection, classification & segmentation models with TensorFlow, PyTorch, MXNet, etc. (a toy PyTorch sketch appears after the Required Skills list).
Required Skills
• Strong development skills required in Python and C++.
• Ability to architect a solution based on given requirements and convert the business requirements into a technical computer vision problem statement.
• Ability to work in a fast-paced environment and coordinate across different parts of different projects.
• Bringing in technical expertise around the implementation of best coding standards and practices across the team.
• Extensive experience of working on edge devices like Jetson Nano, Raspberry Pi and other GPU powered low computational devices.
• Experience with using Docker, Nvidia Docker, and Nvidia NGC containers for computer vision deep learning.
• Experience with scalable cloud deployment architecture for video analytics (involving Kubernetes and/or Kafka).
• Good experience with at least one cloud platform such as AWS, Azure, or Google Cloud.
• Experience working with model optimisation for Nvidia hardware (tensor conversion of both TensorFlow & PyTorch models).
• Proficient understanding of code versioning tools, such as Git.
• Proficient in Data Structures & Algorithms.
• Well versed in software design paradigms and good development practices.
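A toy PyTorch classification sketch: a tiny CNN, one forward pass, and one backward pass on random tensors; the architecture and data are illustrative only.

```python
# Tiny PyTorch classification sketch on synthetic data.
import torch
import torch.nn as nn

class TinyClassifier(nn.Module):
    def __init__(self, num_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(8, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).flatten(1))

model = TinyClassifier()
images = torch.randn(4, 3, 64, 64)   # batch of 4 fake RGB images
labels = torch.randint(0, 3, (4,))   # fake class labels
loss = nn.CrossEntropyLoss()(model(images), labels)
loss.backward()                      # gradients for one training step
print("loss:", loss.item())
```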
Job Description
We are looking for a talented Java Developer for overseas opportunities. You will be responsible for developing high-quality software solutions, working on both server-side components and integrations, and ensuring optimal performance and scalability.
Preferred Qualifications
- Experience with microservices architecture.
- Knowledge of cloud platforms (AWS, Azure).
- Familiarity with Agile/Scrum methodologies.
- Understanding of front-end technologies (HTML, CSS, JavaScript) is a plus.
Requirement Details
Bachelor’s degree in Computer Science, Information Technology, or a related field (or equivalent experience).
Proven experience as a Java Developer or similar role.
Strong knowledge of Java programming language and its frameworks (Spring, Hibernate).
Experience with relational databases (e.g., MySQL, PostgreSQL) and ORM tools.
Familiarity with RESTful APIs and web services.
Understanding of version control systems (e.g., Git).
Solid understanding of object-oriented programming (OOP) principles.
Strong problem-solving skills and attention to detail.
About Databook
Databook is the world’s first AI-powered enterprise customer intelligence platform, founded in 2017 to empower enterprise sales teams with a distinct advantage. Leading companies like Microsoft, Salesforce, and Databricks rely on Databook to enhance customer engagement and accelerate revenue acquisition. Backed by Bessemer Ventures, DFJ Growth, M12 (Microsoft’s Venture fund), Salesforce Ventures, and Threshold Ventures, we operate as a customer-focused, innovative organization headquartered in Palo Alto, CA, with a global distributed team.
About Our Technology Team
The Engineering team at Databook brings together collaborative and technically passionate individuals to deliver innovative customer intelligence solutions. Led by former Google and Salesforce engineers, this group explores the full engineering lifecycle, driving impactful outcomes and offering opportunities for leadership and growth in a hyper-growth context.
The Opportunity
We're seeking a proactive and skilled Platform Engineer to enhance the reliability, scalability, and performance of our platform. This role offers the chance to collaborate closely with cross-functional teams, integrate new technologies, and advance our DevOps and SRE practices. If you're passionate about driving excellence, building robust systems, and contributing to the evolution of an AI-driven platform, join our dynamic team!
Responsibilities
- Promote best practices and standards across engineering teams to ensure platform reliability and performance.
- Collaborate with product management and engineering to enhance platform scalability and align with business goals.
- Develop and optimize backend systems and infrastructure to support platform growth.
- Implement and enhance CI/CD pipelines, automation, monitoring, and alerting systems.
- Document system performance, incidents, and resolutions, producing detailed technical reports.
- Formulate backend architecture plans and provide guidance on deployment strategies and reliability improvements.
- Participate in an on-call rotation to ensure 24/7 platform reliability and rapid incident response.
Qualifications
- 5+ years of experience in Platform or Infrastructure Engineering, DevOps, SRE, or similar roles.
- Strong backend development experience using Python and JavaScript/TypeScript.
- Solid understanding of API design and implementation.
- Proficiency in SQL.
- Experience with CI/CD tools like Jenkins, GitLab CI, CircleCI.
- Hands-on experience with IaC tools such as Terraform, CloudFormation, Ansible.
- Familiarity with monitoring and observability tools like Datadog, Splunk, New Relic, Prometheus.
- Strong analytical and problem-solving skills with a focus on long-term solutions.
- Excellent communication skills for collaboration with technical and non-technical stakeholders.
- Ability to thrive in a fast-paced environment and manage multiple priorities.
Working Arrangements
This position offers a hybrid work mode, combining remote and in-office work as mutually agreed upon.
Ideal Candidates Will Also Have
- Interest or experience in Machine Learning and Generative AI.
- Exposure to performance, load, and stress testing frameworks.
- Familiarity with security best practices and tools in cloud environments.
Join Us and Enjoy These Perks!
- Competitive salary with bonus
- Medical insurance coverage
- Generous leave and public holidays
- Employee referral bonus program
- Annual learning stipend for professional development
- Complimentary subscription to Masterclass
About the company
DCB Bank is a new-generation private sector bank with 442 branches across India. It is a scheduled commercial bank regulated by the Reserve Bank of India. DCB Bank’s business segments are Retail Banking, Micro SME, SME, mid-Corporate, Agriculture, Government, Public Sector, Indian Banks, Co-operative Banks, and Non-Banking Finance Companies.
Job Description
Department: Risk Analytics
CTC: Max 18 Lacs
Grade: Sr Manager/AVP
Experience: Min 4 years of relevant experience
We are looking for a Data Scientist to join our growing team of Data Science experts and manage the processes and people responsible for accurate data collection, processing, modelling, analysis, implementation, and maintenance.
Responsibilities
- Understand, monitor and maintain existing financial scorecards (ML Based) and make changes to the model when required.
- Perform statistical analysis in R and assist the IT team with deployment of ML models and analytical frameworks in Python (a toy scorecard sketch follows this list).
- Should be able to handle multiple tasks and must know how to prioritize the work.
- Lead cross-functional projects using advanced data modelling and analysis techniques to discover insights that will guide strategic decisions and uncover optimization opportunities.
- Develop clear, concise, and actionable solutions and recommendations for the client’s business needs; actively explore the client’s business and formulate ideas that can help the client cut costs efficiently or achieve growth, revenue, and profitability targets faster.
- Build, develop, and maintain data models, reporting systems, data automation systems, dashboards, and performance metrics that support key business decisions.
- Design and build technical processes to address business issues.
- Oversee the design and delivery of reports and insights that analyse business functions and key operations and performance metrics.
- Manage and optimize processes for data intake, validation, mining, and engineering as well as modelling, visualization, and communication deliverables.
- Communicate results and business impacts of insight initiatives to the Management of the company.
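As a toy illustration of an application-scorecard-style model, the sketch below fits a logistic regression on synthetic features and outputs a probability of default; it is not the bank's methodology.

```python
# Toy "scorecard" sketch: logistic regression producing a probability of default (PD).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))  # e.g. income, utilisation, tenure (all synthetic)
y = (X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)  # synthetic default flag

model = LogisticRegression().fit(X, y)
pd_scores = model.predict_proba(X)[:, 1]  # probability of default per applicant
print("mean PD:", round(float(pd_scores.mean()), 3))
```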
Requirements
- Industry knowledge
- 4 or more years of experience in the financial services industry, particularly retail credit, is a must.
- Candidate should have either worked in banking sector (banks/ HFC/ NBFC) or consulting organizations serving these clients.
- Experience in credit risk model building such as application scorecards, behaviour scorecards, and/ or collection scorecards.
- Experience in portfolio monitoring, model monitoring, model calibration
- Knowledge of ECL/ Basel preferred.
- Educational qualification: Advanced degree in finance, mathematics, econometrics, or engineering.
- Technical knowledge: Strong data handling skills in databases such as SQL and Hadoop. Knowledge of data visualization tools such as SAS VI, Tableau, or Power BI is preferred.
- Expertise in either R or Python; SAS knowledge will be a plus.
Soft skills:
- Ability to quickly adapt to the analytical tools and development approaches used within DCB Bank
- Ability to multi-task, with good communication and teamworking skills.
- Ability to manage day-to-day written and verbal communication with relevant stakeholders.
- Ability to think strategically and make changes to data when required.
at Wissen Technology
Job Requirements:
Intermediate Linux Knowledge (required)
- Experience with shell scripting
- Familiarity with Linux commands such as grep, awk, sed
Advanced Python Scripting Knowledge (required)
- Strong expertise in Python
Ruby (nice to have)
Basic Knowledge of Network Protocols (required)
- Understanding of TCP/UDP, multicast/unicast (a minimal UDP socket sketch follows this list)
Packet Captures (nice to have)
- Experience with tools like Wireshark, tcpdump, tshark
High-Performance Messaging Libraries (nice to have)
- Familiarity with tools like Tibco, 29West, LBM, Aeron
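A minimal UDP unicast sketch using only the Python standard library; the port number and payload are arbitrary placeholders.

```python
# Minimal UDP sketch: one socket sends a datagram to another on localhost.
import socket

HOST, PORT = "127.0.0.1", 54321

receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind((HOST, PORT))

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"heartbeat", (HOST, PORT))

data, addr = receiver.recvfrom(1024)  # blocks until the datagram arrives
print(f"received {data!r} from {addr}")
sender.close()
receiver.close()
```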
Key Responsibilities:
• Install, configure, and maintain Hadoop clusters.
• Monitor cluster performance and ensure high availability.
• Manage Hadoop ecosystem components (HDFS, YARN, Ozone, Spark, Kudu, Hive).
• Perform routine cluster maintenance and troubleshooting.
• Implement and manage security and data governance.
• Monitor systems health and optimize performance.
• Collaborate with cross-functional teams to support big data applications.
• Perform Linux administration tasks and manage system configurations.
• Ensure data integrity and backup procedures.
Embedos is looking for superheroes who can help us succeed in our endeavour of becoming a beacon for providing problem-solving Industrial IoT solutions.
Location: MUMBAI
VACANCY: 3 - 4
Embedos makes controllers, interface devices, and cloud-based software solutions for remote monitoring and control and Industry 4.0 applications.
We are looking for engineering superheroes who have a flair for and interest in core hardware, firmware, embedded software, networking, and web technologies.
We want engineers who have wide interests and want to work on multiple specializations and functions in the embedded domain:
• Hardware design: small-signal/telecommunication/interface electronics, digital design, latest microprocessors (STM, ESP), interfaces (I2C, SPI), peripherals, schematics, PCB routing
• Programming languages for embedded devices, their respective IDEs, and debugging systems
• RTOS, Real time programming concepts.
• Linux Kernel programming, peripheral drivers.
• Communication protocols like Modbus, CAN, OPC other industrial protocols.
• Open source software, documentation, versioning systems.
• Web technology, Web applications, Networking technology, Cloud Interfacing.
We invite you to come and join in our Core team to make this endeavour a success and share the rewards.
Embedos is looking for superheroes to work on cutting-edge technology involving interfacing IoT-enabled firmware, cloud computing software, generating exciting user interfaces, developing APIs, designing web app architectures, deploying reusable code, and the works.
Company: CorpCare
Title: Head of Engineering/ Head of Product
Location: Mumbai (work from office)
CTC: Up to 25 Lacs per annum
About Us:
CorpCare is India’s first all-in-one corporate funds and assets management platform. We offer a single-window solution for corporates, family offices, and HNIs. We assist corporates in formulating and managing treasury management policies and conducting reviews with investment committees and the board.
Job Summary:
The Head of Engineering will be responsible for overseeing the development, implementation, and management of our corporate funds and assets management platform. This role demands a deep understanding of the broking industry/Financial services industry, software engineering, and product management. The ideal candidate will have a robust background in engineering leadership, a proven track record of delivering scalable technology solutions, and strong product knowledge.
Key Responsibilities:
- Develop and communicate a clear engineering vision and strategy aligned with our broking and funds management platform.
- Conduct market research and technical analysis to identify trends, opportunities, and customer needs within the broking industry.
- Define and prioritize the engineering roadmap, ensuring alignment with business goals and customer requirements.
- Lead cross-functional engineering teams (software development, QA, DevOps, etc.) to deliver high-quality products on time and within budget.
- Oversee the entire software development lifecycle, from planning and architecture to development and deployment, ensuring robust and scalable solutions.
- Write detailed technical specifications and guide the engineering teams to ensure clarity and successful execution.
- Leverage your understanding of the broking industry to guide product development and engineering efforts.
- Collaborate with product managers to incorporate industry-specific requirements and ensure the platform meets the needs of brokers, traders, and financial institutions.
- Stay updated with regulatory changes, market trends, and technological advancements within the broking sector.
- Mentor and lead a high-performing engineering team, fostering a culture of innovation, collaboration, and continuous improvement.
- Recruit, train, and retain top engineering talent to build a world-class development team.
- Conduct regular performance reviews and provide constructive feedback to team members.
- Define and track key performance indicators (KPIs) for engineering projects to ensure successful delivery and performance.
- Analyze system performance, user data, and platform metrics to identify areas for improvement and optimization.
- Prepare and present engineering performance reports to senior management and stakeholders.
- Work closely with product managers, sales, marketing, and customer support teams to align engineering efforts with overall business objectives.
- Provide technical guidance and support to sales teams to help them understand the platform's capabilities and competitive advantages.
- Engage with customers, partners, and stakeholders to gather feedback, understand their needs, and validate engineering solutions.
Requirements:
- BE/B.Tech in Computer Science
- MBA a plus, not required
- 5+ years of experience in software engineering, with at least 2+ years in a leadership role.
- Strong understanding of the broking industry and financial services industry.
- Proven track record of successfully managing and delivering complex software products.
- Excellent communication, presentation, and interpersonal skills.
- Strong analytical and problem-solving abilities.
- Experience with Agile/Scrum methodologies.
- Deep understanding of software architecture, cloud computing, and modern development practices.
Technical Expertise:
- Front-End: React, Next.js, JavaScript, HTML5, CSS3
- Back-End: Node.js, Express.js, RESTful APIs
- Database: MySQL, PostgreSQL, MongoDB
- DevOps: Docker, Kubernetes, AWS (EC2, S3, RDS), CI/CD pipelines
- Version Control: Git, GitHub/GitLab
- Other: TypeScript, Webpack, Babel, ESLint, Redux
Preferred Qualifications:
- Experience in the broking or financial services industry.
- Familiarity with data analytics tools and methodologies.
- Knowledge of user experience (UX) design principles.
- Experience with trading platforms or financial technology products.
This role is ideal for someone who combines strong technical expertise with a deep understanding of the broking industry and a passion for delivering high-impact software solutions.
Job Description: Python Backend Developer
Experience: 7-12 years
Job Type: Full-time
Job Overview:
Wissen Technology is looking for a highly experienced Python Backend Developer with 7-12 years of experience to join our team. The ideal candidate will have deep expertise in backend development using Python, with a strong focus on Django and Flask frameworks.
Key Responsibilities:
- Develop and maintain robust backend services and APIs using Python, Django, and Flask (a minimal Flask sketch follows this list).
- Design scalable and efficient database schemas, integrating with both relational and NoSQL databases.
- Collaborate with front-end developers and other team members to establish objectives and design functional, cohesive code.
- Optimize applications for maximum speed and scalability.
- Ensure security and data protection protocols are implemented effectively.
- Troubleshoot and debug applications to ensure a seamless user experience.
- Participate in code reviews, testing, and quality assurance processes.
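A minimal Flask REST sketch (assuming Flask 2+ for the `get`/`post` route decorators); the resource, routes, and in-memory store are illustrative placeholders.

```python
# Minimal Flask API sketch with an in-memory store.
from flask import Flask, jsonify, request

app = Flask(__name__)
_books = {1: {"id": 1, "title": "Fluent Python"}}

@app.get("/books/<int:book_id>")
def get_book(book_id: int):
    book = _books.get(book_id)
    return (jsonify(book), 200) if book else (jsonify({"error": "not found"}), 404)

@app.post("/books")
def create_book():
    payload = request.get_json()
    new_id = max(_books) + 1
    _books[new_id] = {"id": new_id, "title": payload["title"]}
    return jsonify(_books[new_id]), 201

if __name__ == "__main__":
    app.run(debug=True)
```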
Required Skills:
Python: Extensive experience in backend development using Python.
Django & Flask: Proficiency in Django and Flask frameworks.
Database Management: Strong knowledge of databases such as PostgreSQL, MySQL, and MongoDB.
API Development: Expertise in building and maintaining RESTful APIs.
Security: Understanding of security best practices and data protection measures.
Version Control: Experience with Git for collaboration and version control.
Problem-Solving: Strong analytical skills with a focus on writing clean, efficient code.
Communication: Excellent communication and teamwork skills.
Preferred Qualifications:
- Experience with cloud services like AWS, Azure, or GCP.
- Familiarity with Docker and containerization.
- Knowledge of CI/CD practices.
Why Join Wissen Technology?
- Opportunity to work on innovative projects with a cutting-edge technology stack.
- Competitive compensation and benefits package.
- A supportive environment that fosters professional growth and learning.
- Architectural Leadership:
- Design and architect robust, scalable, and high-performance Hadoop solutions.
- Define and implement data architecture strategies, standards, and processes.
- Collaborate with senior leadership to align data strategies with business goals.
- Technical Expertise:
- Develop and maintain complex data processing systems using Hadoop and its ecosystem (HDFS, YARN, MapReduce, Hive, HBase, Pig, etc.); a minimal PySpark sketch follows this section.
- Ensure optimal performance and scalability of Hadoop clusters.
- Oversee the integration of Hadoop solutions with existing data systems and third-party applications.
- Strategic Planning:
- Develop long-term plans for data architecture, considering emerging technologies and future trends.
- Evaluate and recommend new technologies and tools to enhance the Hadoop ecosystem.
- Lead the adoption of big data best practices and methodologies.
- Team Leadership and Collaboration:
- Mentor and guide data engineers and developers, fostering a culture of continuous improvement.
- Work closely with data scientists, analysts, and other stakeholders to understand requirements and deliver high-quality solutions.
- Ensure effective communication and collaboration across all teams involved in data projects.
- Project Management:
- Lead large-scale data projects from inception to completion, ensuring timely delivery and high quality.
- Manage project resources, budgets, and timelines effectively.
- Monitor project progress and address any issues or risks promptly.
- Data Governance and Security:
- Implement robust data governance policies and procedures to ensure data quality and compliance.
- Ensure data security and privacy by implementing appropriate measures and controls.
- Conduct regular audits and reviews of data systems to ensure compliance with industry standards and regulations.
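A minimal PySpark aggregation sketch; the inline data stands in for tables that would normally come from HDFS or Hive, and the column names are assumptions.

```python
# Minimal PySpark sketch showing a typical aggregation job.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("demo-aggregation").getOrCreate()

events = spark.createDataFrame(
    [("web", 120), ("web", 80), ("mobile", 200)],
    ["channel", "duration_sec"],
)

summary = events.groupBy("channel").agg(
    F.count("*").alias("events"),
    F.avg("duration_sec").alias("avg_duration_sec"),
)
summary.show()
spark.stop()
```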
Company Description
CorpCare is India’s first all-in-one corporate funds and assets management platform, based in Mumbai. We offer a single-window solution for corporates, family offices, and HNIs to formulate and manage treasury management policies. Our portfolio management system provides assistance in conducting reviews with investment committees and the board.
Role Description
- Role- Python Developer
- CTC: Up to 12 LPA
This is a full-time on-site role for a Python Developer located in Mumbai. The Python Developer will be responsible for back-end web development, software development, object-oriented programming (OOP), programming, and databases. The Python Developer will also be responsible for performing system analysis and creating robust and scalable software solutions.
Qualifications
- 2+ years of work experience with Python (Programming Language)
- Expertise in back-end web development
- Proficiency in software development, especially with the Django framework, FastAPI, REST APIs, and AWS (a minimal FastAPI sketch follows this list)
- Experience in Programming and Databases
- Understanding of Agile development methodologies
- Excellent problem-solving and analytical skills
- Ability to work in a team environment
- Bachelor's or Master's degree in Computer Science or relevant field
- Relevant certifications in Python and related frameworks are preferred
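A minimal FastAPI sketch (fastapi and uvicorn assumed installed); the resource model and routes are illustrative placeholders.

```python
# Minimal FastAPI sketch; run with: uvicorn main:app --reload
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class Policy(BaseModel):
    name: str
    limit_inr: float

_policies = {}  # in-memory store, for illustration only

@app.post("/policies/{policy_id}")
def create_policy(policy_id: int, policy: Policy):
    _policies[policy_id] = policy
    return policy

@app.get("/policies/{policy_id}")
def read_policy(policy_id: int):
    if policy_id not in _policies:
        raise HTTPException(status_code=404, detail="policy not found")
    return _policies[policy_id]
```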
Description of Role
- We are looking for an enthusiastic, analytically minded, data-driven BI Consultant/Developer to join our Mumbai-based Data & Analytics team.
- The Business Intelligence consultant/developer is a member of the analytics team whose focus is to provide hands-on consulting and development:
- Understand and document the reporting business requirements, processes, and workflows, developing both written and visual depictions of requirements and process flows.
- Work closely with the BI product manager and lead dashboard design and information presentation ideation meetings with business stakeholders.
- Work extensively in Tableau Desktop and MS Power BI, designing and building highly customized dashboards that provide compelling visual analytics; build assets like datasets, reports, and dashboards using tools such as Tableau, Power BI, and Dataiku for internal customers within investments (Performance, Marketing, Sales, etc.).
Key responsibilities
- Build Tableau/Power BI data models utilizing best practices, ensuring data accuracy and the ability to blend with other certified data sources.
- Understand the Analytics & Business Intelligence framework and partner with the Analytics Technology & Enterprise Reporting Technology teams to build and deploy reports and dashboards.
- Evaluate data sources and technology options and recommend appropriate solutions.
- Build and maintain a reporting data dictionary and data lineage.
- Perform data analysis, data mapping, and develop ad hoc queries.
- Write SQL queries, stored procedures, scripts, and ETL jobs.
- Perform QA testing of developed deliverables and assist with User Acceptance Testing.
- Manage the prioritization, logging, and tracking of user-related issues, defects, enhancements, and work requests.
Experience and Skills
- Analytic and data-driven mindset with a finesse for building dashboards that tell a story.
- Strong communication skills, interacting effectively with quantitative colleagues as well as less technical audiences.
- Minimum of 5 years of proven experience creating dashboards, reports, and visualizations with interactive capabilities.
- Minimum of 5 years' experience in business/data analysis and data preparation.
- Minimum of 2 years of experience working with big data and building data apps using Python.
- Broad industry knowledge of investment management is a plus.
- Excellent verbal and writing skills in English.
- Excellent time management and ability to work under pressure to meet tight deadlines.
- Maintain a strong commitment to quality.
Description of Role:
We are looking for a career-minded professional with a global perspective to join the Mumbai-based Data & Analytics Group (DAG).
Key responsibilities:
As part of the Data & Analytics Group (DAG) and reporting to the Head of the India DAG locally, the individual is responsible for the following –
1. Review, analyze, and resolve data quality issues for enterprise core data in the IM Data Warehouse.
2. Coordinate with data owners and other teams to identify the root cause of data quality issues and implement solutions.
3. Coordinate the onboarding of data from various internal/external sources into the central repository.
4. Work closely with Data Owners/Owner delegates on data analysis and development of data quality (DQ) rules; work with IT on enhancing DQ controls (a small DQ-rule sketch follows this list).
5. End-to-end analysis of business processes, data flows, and data usage to improve business productivity through re-engineering and data governance.
6. Interact with business stakeholders to identify, prioritize, and address data-related needs and issues.
7. Document business requirements and use cases for data-related projects.
8. Manage the change control process and participate in user acceptance testing (UAT) activities.
9. Support other DAG initiatives.
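A small sketch of data-quality (DQ) rules expressed with pandas; the dataset and rules are illustrative examples, not the group's actual DQ framework.

```python
# Illustrative data-quality checks with pandas on a synthetic holdings table.
import pandas as pd

holdings = pd.DataFrame({
    "security_id": ["ABC123", None, "XYZ789"],
    "market_value": [1_000_000, 250_000, -50],
    "currency": ["INR", "USD", "EUR"],
})

dq_issues = pd.concat([
    holdings[holdings["security_id"].isna()].assign(rule="security_id must not be null"),
    holdings[holdings["market_value"] < 0].assign(rule="market_value must be non-negative"),
    holdings[~holdings["currency"].isin(["INR", "USD"])].assign(rule="currency not in approved list"),
])
print(dq_issues[["security_id", "rule"]])  # rows to route back to data owners
```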
Key Skills:
1. Ability to interact with business and technology teams to understand processes and data usage.
2. Ability to do data analysis and trace data from source to consumption.
3. Capable of working across the organization and as part of a cross-functional virtual team; able to work and think independently, but within a team-based approach.
4. Problem solver and self-starter, with the ability to work through an entire issue lifecycle and to effectively prioritize and multi-task.
5. Strong communication skills.
Key qualifications:
1. Bachelor’s degree required; any other relevant academic course is a plus.
2. Strong domain knowledge of investment data.
3. 5+ years of data management, data analytics, or data governance experience in financial services.
4. Experience in data analysis, exploratory analysis using SQL, and formulating data quality rules.
5. Experience working with BI reporting tools like Tableau and Power BI is preferred.
6. Knowledge of coding; Python is a plus.
7. Prior experience working with data platforms like Aladdin, FactSet, Bloomberg, and MDM platforms preferred.
Job Description :
● Designation : Full Stack Developer
● Industry : Software Development
● Role category : Software Developer
● Education : Graduate
● Total Experience years : 3-5 Years
● Relevant Experience years : 3-5 Years
● Main skills required : Python, Django, JavaScript, React.js
● Gender : Male / Female
● Salary range : 6L to 12 L Per annum
● Job location : Navi Mumbai / Bangalore
● Office time : 11 AM to 8 PM
Hiring Process:
Following are some guidelines for the hiring process. It lists out the expected experience and skill set required for the position. In terms of the hiring procedure, our first step involves arranging a video interview with the candidates who have been shortlisted. Subsequently, we will proceed to schedule an in-office interview.
Resume Requirements
● Detailed Technical Skillset
● Detailed description of the Project
● Description of the modules/system the developer was directly involved in developing
Additional Attributes, if available
● Description of technical challenges faced and their implementation
● Link to their code repository (GitHub, Bitbucket, etc.)
● Link to the project
Full Stack Developer
Experience Requirement :
3-5 years of experience in Javascript, ReactJS and Python/Django
Technical skills Frontend:
● Strong proficiency in JavaScript, including DOM manipulation and the JavaScript object model
● Thorough understanding of React.js and its core principles, including React Hooks, Context, and Higher-Order Components
● Prior experience with popular React.js workflows (such as Flux or
Redux)
● Familiarity with newer ECMAScript specifications
● Prior experience with data structure libraries (e.g., Immutable.js)
● Familiarity with RESTful APIs
● Familiarity with HTML / CSS
● Experience with common front-end development tools such as Babel, Webpack, npm, etc.
Additional Skills (not compulsory):
● GIT experience is a plus
● Knowledge of isomorphic React is a plus
● Knowledge of modern authorization mechanisms, such as JSON Web Token
● Familiarity with modern front-end build pipelines and tools
● Ability to understand business requirements and translate them into technical requirements
● A knack for benchmarking and optimization
Technical skills Backend:
We are looking for a Python/Django developer who is well versed in the Python language and the Django framework. Knowledge of other Python web frameworks is an advantage.
Skills needed
● Good understanding of the Python language (3+ years of experience)
● Proficient in the Django development framework
● Good understanding of REST architecture
● Good understanding of and experience writing regular expressions (see the brief sketch after this list)
● Familiarity with some ORM (Object Relational Mapper) libraries
● Hands-on experience with application deployment
● Knowledge of user authentication and authorisation between multiple systems, servers and environments
● Understanding of fundamental design principles behind a scalable & distributed application
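For illustration only, a minimal sketch of the regular-expression skill listed above; the order-code format is a made-up example, not a project requirement:

```python
# Illustrative sketch only; the order-code format is hypothetical.
import re
from typing import Optional

# Codes such as "ORD-2024-00042": a prefix, a 4-digit year, and a 5-digit sequence.
ORDER_CODE_RE = re.compile(r"^ORD-(?P<year>\d{4})-(?P<seq>\d{5})$")

def parse_order_code(code: str) -> Optional[dict]:
    """Return the named groups of a valid order code, or None if it is malformed."""
    match = ORDER_CODE_RE.match(code)
    return match.groupdict() if match else None

if __name__ == "__main__":
    print(parse_order_code("ORD-2024-00042"))  # {'year': '2024', 'seq': '00042'}
    print(parse_order_code("ORD-24-42"))       # None
```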
Primary Skills
DynamoDB, Java, Kafka, Spark, Amazon Redshift, AWS Lake Formation, AWS Glue, Python
Skills:
Good work experience showing growth as a Data Engineer.
Hands On programming experience
Implementation Experience on Kafka, Kinesis, Spark, AWS Glue, AWS Lake Formation.
Excellent knowledge in: Python, Scala/Java, Spark, AWS (Lambda, Step Functions, DynamoDB, EMR), Terraform, UI (Angular), Git, Maven
Experience in performance optimization of batch and real-time processing applications
Expertise in Data Governance and Data Security implementation
Good hands-on design and programming skills building reusable tools and products. Experience developing in AWS or similar cloud platforms; preferred: ECS, EKS, S3, EMR, DynamoDB, Aurora, Redshift, QuickSight, or similar.
Familiarity with systems with a very high volume of transactions, microservice design, or data processing pipelines (Spark).
Knowledge of and hands-on experience with serverless technologies such as Lambda, MSK, MWAA, and Kinesis Analytics is a plus.
Expertise in practices like Agile, Peer reviews, Continuous Integration
Roles and responsibilities:
Determining project requirements and developing work schedules for the team.
Delegating tasks and achieving daily, weekly, and monthly goals.
Responsible for designing, building, testing, and deploying the software releases.
Salary: 25LPA-40LPA
Job Description:
· Proficient in Python.
· Good knowledge of stress/load testing and performance testing.
· Knowledge of Linux.
We are looking for a QA engineer with experience in Python, AWS, and chaos engineering tools (Chaos Monkey, Gremlin).
- Strong understanding of distributed systems, cloud computing (AWS), and networking principles.
- Ability to understand complex trading systems and prepare and execute plans to induce failures
- Python.
- Experience with chaos engineering tooling such as Chaos Monkey, Gremlin, or similar
Technical Skills:
- Ability to understand and translate business requirements into design.
- Proficient in AWS infrastructure components such as S3, IAM, VPC, EC2, and Redshift.
- Experience in creating ETL jobs using Python/PySpark.
- Proficiency in creating AWS Lambda functions for event-based jobs.
- Knowledge of automating ETL processes using AWS Step Functions.
- Competence in building data warehouses and loading data into them.
Responsibilities:
- Understand business requirements and translate them into design.
- Assess AWS infrastructure needs for development work.
- Develop ETL jobs using Python/PySpark to meet requirements (see the brief sketch after this list).
- Implement AWS Lambda for event-based tasks.
- Automate ETL processes using AWS Step Functions.
- Build data warehouses and manage data loading.
- Engage with customers and stakeholders to articulate the benefits of proposed solutions and frameworks.
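As a rough illustration of the PySpark ETL work referenced above, here is a minimal sketch; the bucket names, paths, and column names are assumptions, not part of the role description:

```python
# Illustrative sketch only; bucket names, paths, and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

def run_etl(input_path: str, output_path: str) -> None:
    """Read raw CSV transactions, clean them, and write partitioned Parquet."""
    spark = SparkSession.builder.appName("transactions-etl").getOrCreate()

    raw = (
        spark.read
        .option("header", "true")
        .csv(input_path)
    )

    cleaned = (
        raw
        .withColumn("amount", F.col("amount").cast("double"))
        .filter(F.col("amount") > 0)                          # drop refunds / bad rows
        .withColumn("txn_date", F.to_date("txn_timestamp"))   # derive partition column
    )

    (
        cleaned.write
        .mode("overwrite")
        .partitionBy("txn_date")
        .parquet(output_path)
    )
    spark.stop()

if __name__ == "__main__":
    run_etl("s3://raw-bucket/transactions/", "s3://curated-bucket/transactions/")
```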
Data Scientist – Program Embedded
Job Description:
We are seeking a highly skilled and motivated senior data scientist to support a big data program. The successful candidate will play a pivotal role in supporting multiple projects in this program, covering traditional tasks from revenue management, demand forecasting, and improving customer experience, to testing and using new tools/platforms such as Copilot and Fabric for different purposes. The ideal candidate will have deep expertise in machine learning methodology and applications, and should have completed multiple large-scale data science projects (full cycle, from ideation to BAU). Beyond technical expertise, problem solving in complex set-ups will be key to success in this role. Because this data science role is embedded directly in the program's projects, stakeholder management and collaboration with partners are crucial to success, on top of deep technical expertise.
What we are looking for:
- Highly proficient in Python/PySpark/R.
- Understands MLOps concepts, with working experience in product industrialization from a data science point of view: building products for live deployment, with continuous development and continuous integration.
- Familiar with cloud platforms such as Azure and GCP, and the data management systems on such platforms. Familiar with Databricks and product deployment on Databricks.
- Experience in ML projects involving techniques such as regression, time series, clustering, classification, dimension reduction, and anomaly detection, with both traditional ML and DL approaches.
- Solid background in statistics, probability distributions, A/B testing validation, univariate/multivariate analysis, hypothesis testing for different purposes, data augmentation, etc. (see the brief sketch after this list).
- Familiar with designing testing frameworks for different modelling practices/projects based on business needs.
- Exposure to GenAI tools, with enthusiasm for experimenting and new ideas on what can be done.
- Having improved an internal company process using an AI tool would be great (e.g., process simplification, manual task automation, automated emails).
- Ideally 10+ years of experience, including independent, business-facing roles.
- Experience as a data scientist in CPG or retail would be nice but is not the top priority, especially for candidates who have worked across multiple industries.
- Being proactive and collaborative is essential.
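For illustration, a minimal hypothesis-testing sketch of the kind mentioned above, using a two-sample t-test on synthetic per-user spend data:

```python
# Illustrative sketch only; the data are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical per-user spend for a control and a treatment group.
control = rng.normal(loc=100.0, scale=15.0, size=500)
treatment = rng.normal(loc=103.0, scale=15.0, size=500)

# Welch's t-test (does not assume equal variances).
t_stat, p_value = stats.ttest_ind(treatment, control, equal_var=False)

print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Reject H0: the uplift is statistically significant at the 5% level.")
else:
    print("Fail to reject H0: no significant uplift detected.")
```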
Some projects examples within the program:
- Test new tools/platforms such as Copilot and Fabric for commercial reporting: testing, validation, and building trust.
- Build algorithms for predicting trends in categories and consumption to support dashboards.
- Revenue Growth Management: create and understand the algorithms behind the tools (which may be built by third parties) that we need to maintain or choose to improve. Able to prioritize and build a product roadmap. Able to design new solutions and articulate/quantify the limitations of the solutions.
- Demand forecasting: create localized forecasts to improve in-store availability, with proper model monitoring for early detection of potential issues in the forecast, focusing particularly on improving the end-user experience.
Data Scientist – Delivery & New Frontiers Manager
Job Description:
We are seeking a highly skilled and motivated data scientist to join our Data Science team. The successful candidate will play a pivotal role in our data-driven initiatives and be responsible for designing, developing, and deploying data science solutions that drive business value for stakeholders. This role involves mapping business problems to a formal data science solution, working with a wide range of structured and unstructured data, architecture design, creating sophisticated models, setting up operations for the data science product with support from the MLOps team, and facilitating business workshops. In a nutshell, this person will represent data science and provide expertise across the full project cycle. Expectations of the successful candidate will be above those of a typical data scientist. Beyond technical expertise, problem solving in complex set-ups will be key to success in this role.
Responsibilities:
- Collaborate with cross-functional teams, including software engineers, product managers, and business stakeholders, to understand business needs and identify data science opportunities.
- Map complex business problems to data science problems and design data science solutions using the GCP/Azure Databricks platform.
- Collect, clean, and preprocess large datasets from various internal and external sources.
- Streamline the data science process, working with Data Engineering and Technology teams.
- Manage multiple analytics projects within a function to deliver end-to-end data science solutions, create insights, and identify patterns.
- Develop and maintain data pipelines and infrastructure to support the data science projects.
- Communicate findings and recommendations to stakeholders through data visualizations and presentations.
- Stay up to date with the latest data science trends and technologies, specifically for GCP.
Education / Certifications:
Bachelor’s or Master’s in Computer Science, Engineering, Computational Statistics, Mathematics.
Job specific requirements:
- Brings 5+ years of deep data science experience
- Strong knowledge of machine learning and statistical modeling techniques in a cloud-based environment such as GCP, Azure, or Amazon
- Experience with programming languages such as Python, R, Spark
- Experience with data visualization tools such as Tableau, Power BI, and D3.js
- Strong understanding of data structures, algorithms, and software design principles
- Experience with GCP platforms and services such as BigQuery, Cloud ML Engine, and Cloud Storage
- Experience configuring and setting up version control for code, data, and machine learning models using GitHub.
- Self-driven, able to work with cross-functional teams in a fast-paced environment, and adaptable to changing business needs.
- Strong analytical and problem-solving skills
- Excellent verbal and written communication skills
- Working knowledge of application architecture, data security, and compliance.
at Accrete
Responsibilities:
- Collaborating with data scientists and machine learning engineers to understand their requirements and design scalable, reliable, and efficient machine learning platform solutions.
- Building and maintaining the applications and infrastructure to support end-to-end machine learning workflows, including inference and continual training.
- Developing systems for the definition, deployment, and operation of the different phases of the machine learning and data life cycles.
- Working within Kubernetes to orchestrate and manage containers, ensuring high availability and fault tolerance of applications.
- Documenting the platform's best practices, guidelines, and standard operating procedures and contributing to knowledge sharing within the team.
Requirements:
- 3+ years of hands-on experience in developing and managing machine learning or data platforms
- Proficiency in programming languages commonly used in machine learning and data applications such as Python, Rust, Bash, Go
- Experience with containerization technologies, such as Docker, and container orchestration platforms like Kubernetes.
- Familiarity with CI/CD pipelines for automated model training and deployment. Basic understanding of DevOps principles and practices.
- Knowledge of data storage solutions and database technologies commonly used in machine learning and data workflows.
Responsibilities
- Work on execution and scheduling of all tasks related to assigned projects' deliverable dates
- Optimize and debug existing codes to make them scalable and improve performance
- Design, development, and delivery of tested code and machine learning models into production environments
- Work effectively in teams, managing and leading teams
- Provide effective, constructive feedback to the delivery leader
- Manage client expectations and work with an agile mindset with machine learning and AI technology
- Design and prototype data-driven solutions
Eligibility
- Highly experienced in designing, building, and shipping scalable, production-quality machine learning algorithms in Python applications
- Working knowledge and experience in NLP core components (NER, Entity Disambiguation, etc.)
- In-depth expertise in Data Munging and Storage (Experienced in SQL, NoSQL, MongoDB, Graph Databases)
- Expertise in writing scalable APIs for machine learning models (see the brief serving sketch after this list)
- Experience with maintaining code logs, task schedulers, and security
- Working knowledge of machine learning techniques, feed-forward, recurrent and convolutional neural networks, entropy models, supervised and unsupervised learning
- Experience with at least one of the following: Keras, Tensorflow, Caffe, or PyTorch
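A minimal sketch of the kind of model-serving API referred to above, assuming a scikit-learn model saved with joblib; the model path and payload shape are assumptions:

```python
# Illustrative sketch only; the model path and feature layout are hypothetical.
import joblib
import numpy as np
from flask import Flask, jsonify, request

app = Flask(__name__)
model = joblib.load("model.joblib")  # e.g. a fitted scikit-learn classifier

@app.route("/predict", methods=["POST"])
def predict():
    """Accept {"features": [[...], ...]} and return class predictions."""
    payload = request.get_json(force=True)
    features = np.asarray(payload["features"], dtype=float)
    preds = model.predict(features)
    return jsonify({"predictions": preds.tolist()})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)
```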
Dear Connections,
We are hiring! Join our dynamic team as a QA Automation Tester (Python, Java, Selenium, API, SQL, Git)! We're seeking a passionate professional to contribute to our innovative projects. If you thrive in a collaborative environment, possess expertise in Python, Java, Selenium, and Robot Framework, and are ready to make an impact, apply now! Wissen Technology is committed to fostering innovation, growth, and collaboration. Don't miss this chance to be part of something extraordinary.
Company Overview:
Wissen is the preferred technology partner for executing transformational projects and accelerating implementation through thought leadership and a solution mindset. It is a leading IT solutions and consultancy firm dedicated to providing innovative and customized solutions to global clients. We leverage cutting-edge technologies to empower businesses and drive digital transformation.
The Role:
As an ML Engineer at TIFIN, you will be responsible for driving research and innovation in a result-oriented direction. Your role will involve staying updated on the latest research trends and exploring advancements in Natural Language Understanding (NLU) and their applications in conversational AI. You will also play a mentoring role and contribute to improving NLU capabilities using transfer learning from historical conversational data.
Requirements:
- Experience with training and fine-tuning machine learning models on large text datasets.
- Strong computer science fundamentals and at least 3 years of software development experience.
- Track record of thinking big and finding simple solutions while dealing with ambiguity.
- Proven experience as a Natural Language Processing Engineer or in similar roles.
- Good understanding of NLP tricks and techniques for semantic extraction, data structure, and modeling.
- Familiarity with text representation techniques, algorithms, and statistics.
- Knowledge of programming languages such as R, Python, and Java.
- Proficiency in Machine Learning frameworks like TensorFlow, PyTorch, etc.
- Ability to design software architectures and solve complex problems.
- Strong analytical and problem-solving skills.
- Experience in projects related to information retrieval, machine comprehension, entity recognition, text classification, semantic frame parsing, or machine translation is a plus.
- Publications, patents, or conference talks in relevant fields are a bonus.
JD-Backend Developer
Job Location: Andheri, Mumbai
Job Summary:
As a Python-based backend developer, you will be responsible for software development on a greenfield project: a platform for automated testing that uses Python as its main language. You will work with a team of developers, collaborating on projects to create efficient and effective software solutions that meet the needs of clients or end users in the testing domain. You will need a strong understanding of Python and other languages, along with knowledge of test automation frameworks, tools, and libraries.
Responsibilities:
• Develop and maintain an automated testing platform using Python programming language.
• Design and implement software solutions that automate the testing process (see the brief sketch after this list).
• Collaborate with other developers and cross-functional teams to deliver high-quality software solutions.
• Participate in code reviews, debugging, and troubleshooting to improve software quality.
• Stay up to date with the latest developments in the Python and JavaScript programming languages and related testing frameworks and tools.
• Ensure software is delivered on-time and meets or exceeds customer expectations.
• Develop and maintain documentation related to software development processes and projects.
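For illustration only, a minimal sketch of the style of Python test automation such a platform would run, using pytest; the function under test is hypothetical:

```python
# Illustrative sketch only; the function under test is hypothetical.
import pytest

def normalise_status(raw: str) -> str:
    """Example function under test: map raw device output to a canonical status."""
    mapping = {"ok": "PASS", "nok": "FAIL", "timeout": "ERROR"}
    return mapping.get(raw.strip().lower(), "UNKNOWN")

@pytest.mark.parametrize(
    "raw, expected",
    [
        ("OK", "PASS"),
        ("  nok ", "FAIL"),
        ("timeout", "ERROR"),
        ("garbled", "UNKNOWN"),
    ],
)
def test_normalise_status(raw, expected):
    # Each parametrized case runs as a separate test.
    assert normalise_status(raw) == expected
```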
Requirements:
• Bachelor's degree in Computer Science, Software Engineering or a related field.
• Proven experience as a Back-end Developer.
• Strong understanding of Python and other programming languages, along with their ecosystems (libraries, frameworks, tools, etc.).
• Excellent written and verbal communication skills.
Preferences:
• Preference shall be given to local candidates
• Preference shall be given to candidates having experience in greenfield software development projects related
to automated testing.
JOB TITLE/HEADLINE: Backend Developer
EMPLOYMENT TYPE: Full time, permanent.
REPORTING TO: Ajit Kumar
LOCATION: Andheri, Mumbai
Backend Developer
Position Overview:
As a Backend Developer at FatakPay, you will play a crucial role in designing, implementing, testing, and maintaining the backend systems that power our applications. You will collaborate with cross-functional teams to develop scalable and efficient server-side applications, ensuring high performance and responsiveness.
Responsibilities:
Design, develop, and maintain robust and scalable backend systems.
Collaborate with front-end developers, designers, and other stakeholders to integrate user-facing elements with server-side logic.
Implement and maintain data storage solutions, ensuring optimal performance and data integrity.
Collaborate with DevOps to deploy and manage applications in a cloud environment.
Write clean, efficient, and maintainable code, following best practices and coding standards.
Conduct code reviews and provide constructive feedback to team members. Identify and address performance bottlenecks, security issues, and other technical challenges.
Stay updated on emerging trends and technologies in backend development.
Qualifications:
Bachelor's degree in Computer Science, Software Engineering, or a related field. Proven experience as a Backend Developer or similar role.
Strong proficiency in server-side programming languages such as Java, Python, or Node.js.
Experience with database systems such as MySQL, PostgreSQL, or MongoDB. Familiarity with RESTful API design and integration.
Knowledge of cloud platforms like AWS, Azure, or Google Cloud. Understanding of code versioning tools, such as Git.
Excellent problem-solving and communication skills. Ability to work collaboratively in a team environment.
ABOUT THE COMPANY:
FatakPay Digital Private Ltd (FatakPay) is a digital-only lending platform that provides virtual credit facilities. The solution provides a 100% digital and paperless, quick, transparent, and secure way to transact in a multilingual format with a ‘scan now, pay later’ facility and easy repayment options.
We're a company that strongly believes in teamwork, design, creativity and tech. We love to build the best possible Financial Tech products that make the world a better place.
WHAT HAPPENS NEXT?
If selected for an interview, you will hear directly from someone from the HR department, usually within a week of the closing date.
Job Title: SQL Query Writer - Analytics Automation
Location: Thane (West), Mumbai
Experience: 4-5 years
Responsibilities:
- Develop and optimize SQL queries for efficient data retrieval and analysis.
- Automate analytics processes using platforms like SQL, Python, ACL, Alteryx, Analyzer, Excel Macros, and Access Query (see the brief sketch after this list).
- Collaborate with cross-functional teams to understand analytical requirements and provide effective solutions.
- Ensure data accuracy, integrity, and security in automated processes.
- Troubleshoot and resolve issues related to analytics automation.
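For illustration, a minimal sketch of the SQL-plus-Python automation described above, run against an in-memory SQLite database; the table and column names are hypothetical:

```python
# Illustrative sketch only; table and column names are hypothetical.
import sqlite3
import pandas as pd

def monthly_spend_report(conn: sqlite3.Connection) -> pd.DataFrame:
    """Run an aggregation query and return the result as a DataFrame."""
    query = """
        SELECT strftime('%Y-%m', txn_date) AS month,
               category,
               SUM(amount)                 AS total_spend
        FROM transactions
        GROUP BY month, category
        ORDER BY month, total_spend DESC
    """
    return pd.read_sql_query(query, conn)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE transactions (txn_date TEXT, category TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO transactions VALUES (?, ?, ?)",
        [("2024-01-05", "travel", 1200.0),
         ("2024-01-19", "food", 450.0),
         ("2024-02-02", "travel", 800.0)],
    )
    print(monthly_spend_report(conn))
```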
Qualifications:
- Minimum 3 years of experience in SQL query writing and analytics automation.
- Proficiency in SQL, Python, ACL, Alteryx, Analyzer, Excel Macros, and Access Query.
- Strong analytical skills and attention to detail.
Title:- Data Scientist
Experience:-6 years
Work Mode:- Onsite
Primary Skills:- Data Science, SQL, Python, Data Modelling, Azure, AWS, Banking Domain (BFSI/NBFC)
Qualification:- Any
Roles & Responsibilities:-
1. Acquiring, cleaning, and preprocessing raw data for analysis.
2. Utilizing statistical methods and tools for analyzing and interpreting complex datasets.
3. Developing and implementing machine learning models for predictive analysis.
4. Creating visualizations to effectively communicate insights to both technical and non-technical stakeholders.
5. Collaborating with cross-functional teams, including data engineers, business analysts, and domain experts.
6. Evaluating and optimizing the performance of machine learning models for accuracy and efficiency.
7. Identifying patterns and trends within data to inform business decision-making.
8. Staying updated on the latest advancements in data science, machine learning, and relevant technologies.
Requirement:-
1. Experience with modeling techniques such as linear regression, clustering, and classification (see the brief clustering sketch after this list).
2. Must have a passion for data, structured or unstructured. 0.6 – 5 years of hands-on experience with Python and SQL is a must.
3. Should have sound experience in data mining, data analysis and machine learning techniques.
4. Excellent critical thinking, verbal and written communications skills.
5. Ability and desire to work in a proactive, highly engaging, high-pressure, client service environment.
6. Good presentation skills.
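For illustration, a minimal sketch of the clustering technique listed above, using scikit-learn on synthetic data; the feature names are hypothetical:

```python
# Illustrative sketch only; the data and feature names are synthetic.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical customer features: monthly balance and transaction count.
X = np.vstack([
    rng.normal([20_000, 5], [3_000, 2], size=(100, 2)),     # low-activity segment
    rng.normal([150_000, 40], [20_000, 8], size=(100, 2)),  # high-activity segment
])

# Scale features before clustering, then fit K-Means with two clusters.
X_scaled = StandardScaler().fit_transform(X)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X_scaled)

print("Cluster sizes:", np.bincount(labels))
```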
Job Title: Data Analyst with Python
Experience: 4+ years
Location: Mumbai
Working Mode: Onsite
Primary Skills: Python, Data Analysis of RDs, FDs, Saving Accounts, Banking domain, Risk Consultant, Car Loan Model, Cross-sell model
Qualification: Any graduation
Job Description
1. With 4 to 5 years of banking analytics experience
2. Data Analyst profile with good domain understanding
3. Good with Python and worked on Liabilities (Mandatory)
4. For Liabilities, should have worked on Savings A/c, FD, and RD (Mandatory)
5. Good with stakeholder management and requirement gathering
6. Only from banking industry
ABOUT THE ROLE:
Job Description:
Overview:
We are in search of a seasoned Technical Lead with expertise in Microservices, Cloud Infrastructure,
API Integrations, Networking, Python, Database technologies, and strong proficiency in DevOps
practices. The ideal candidate will have a minimum of 5+ years of experience, showcasing a
comprehensive skill set and a track record of successful project delivery. This role requires
proficiency in cloud services, a deep understanding of DevOps principles, and experience with
database management. Additionally, cloud migration experience is considered a valuable asset.
Key Responsibilities:
• Technical Leadership:
• Lead and inspire a cross-functional team, providing technical guidance and
support.
• Ensure the successful implementation of technical solutions aligned with project goals and timelines.
• Microservices Architecture:
• Design and implement scalable and modular micro-services architecture.
• Enforce best practices in micro-services development, deployment, and
monitoring.
• Cloud Infrastructure and DevOps:
• Architect, deploy, and manage cloud infrastructure on platforms such as AWS, Azure, or Google Cloud.
• Demonstrate strong DevOps practices to streamline development, testing, and deployment processes.
• API Integrations:
• Develop and integrate APIs for seamless communication between system components.
• Prioritize security and reliability in API integrations.
• Networking:
• Design and implement network solutions for distributed systems.
• Troubleshoot and optimise network performance.
• Programming Skills:
• Proficiency in Python for backend development, using the Django or Flask framework.
• Collaborate with the development team to ensure code quality and
adherence to best practices.
• Database Management:
• Design and optimise database schemas for performance and scalability.
• Experience with both SQL and NoSQL databases.
Mandatory Skills:
• Cloud services expertise in any one of AWS, Azure, or Google Cloud.
• Strong understanding and application of DevOps principles.
• Database management experience.
Preferred Skills:
• Cloud migration experience.
• Certification in relevant cloud platforms.
• Familiarity with containerisation, CI/CD and orchestration tools (Docker, Jenkins,
Kubernetes).
Qualifications:
• Bachelor's or higher degree in Computer Science, Engineering, or a related field.
• Minimum of 5+ years of experience in a technical leadership role.
• Proven expertise in micro-services architecture, cloud infrastructure, API
integrations, networking, Python, and database technologies.
ABOUT THE COMPANY:
FatakPay Digital Private Ltd (FatakPay) is a digital-only lending platform that provides virtual credit facilities. The solution provides a 100% digital and paperless, quick, transparent, and secure way to transact in a multilingual format with a ‘scan now, pay later’ facility and easy repayment options.
We're a company that strongly believes in teamwork, design, creativity and tech. We love to build
the best possible Financial Tech products that make the world a better place.
WHAT HAPPENS NEXT?
If selected for an interview, you will hear directly from someone from the HR department, usually
within a week of the closing date.
Job Title: Credit Risk Analyst
Company: FatakPay FinTech
Location: Mumbai, India
Salary Range: INR 8 - 15 Lakhs per annum
Job Description:
FatakPay, a leading player in the fintech sector, is seeking a dynamic and skilled Credit Risk Analyst to join our team in Mumbai. This position is tailored for professionals who are passionate about leveraging technology to enhance financial services. If you have a strong background in engineering and a keen eye for risk management, we invite you to be a part of our innovative journey.
Key Responsibilities:
- Conduct thorough risk assessments by analyzing borrowers' financial data, including financial statements, credit scores, and income details.
- Develop and refine predictive models using advanced statistical methods to forecast loan defaults and assess creditworthiness (see the brief sketch after this list).
- Collaborate in the formulation and execution of credit policies and risk management strategies, ensuring compliance with regulatory standards.
- Monitor and analyze the performance of loan portfolios, identifying trends, risks, and opportunities for improvement.
- Stay updated with financial regulations and standards, ensuring all risk assessment processes are in compliance.
- Prepare comprehensive reports on credit risk analyses and present findings to senior management.
- Work closely with underwriting, finance, and sales teams to provide critical input influencing lending decisions.
- Analyze market trends and economic conditions, adjusting risk assessment models and strategies accordingly.
- Utilize cutting-edge financial technologies for more efficient and accurate data analysis.
- Engage in continual learning to stay abreast of new tools, techniques, and best practices in credit risk management.
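For illustration only, a minimal sketch of a default-probability model of the kind described above, fitted with logistic regression; all borrower features and labels below are synthetic and hypothetical:

```python
# Illustrative sketch only; all borrower data here are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 2_000

# Hypothetical features: credit score, debt-to-income ratio, monthly income.
credit_score = rng.normal(650, 80, n)
dti = rng.uniform(0.05, 0.8, n)
income = rng.lognormal(mean=3.5, sigma=0.4, size=n)
X = np.column_stack([credit_score, dti, income])

# Synthetic default labels: higher DTI and lower score raise the default probability.
logit = -2.0 + 4.0 * dti - 0.01 * (credit_score - 650)
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=7)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

pd_scores = model.predict_proba(X_test)[:, 1]   # probability of default
print("AUC:", round(roc_auc_score(y_test, pd_scores), 3))
```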
Qualifications:
- Minimum qualification: B.Tech or Engineering degree from a reputed institution.
- 2-4 years of experience in credit risk analysis, preferably in a fintech environment.
- Proficiency in data analysis, statistical modeling, and machine learning techniques.
- Strong analytical and problem-solving skills.
- Excellent communication skills, with the ability to present complex data insights clearly.
- A proactive approach to work in a fast-paced, technology-driven environment.
- Up-to-date knowledge of financial regulations and compliance standards.
We look forward to discovering how your expertise and innovative ideas can contribute to the growth and success of FatakPay. Join us in redefining the future of fintech!
1. Bridging the gap between IT and the business using data analytics to assess processes, determine requirements and deliver data-driven recommendations and reports to executives and stakeholders.
2. Ability to search, extract, transform and load data from various databases, cleanse and refine data until it is fit-for-purpose
3. Work within various time constraints to meet critical business needs, while measuring and identifying activities performed and ensuring service requirements are met
4. Prioritization of issues to meet deadlines while ensuring high-quality delivery
5. Ability to pull data and to perform ad hoc reporting and analysis as needed
6. Ability to adapt quickly to new and changing technical environments as well as strong analytical and problem-solving abilities
7. Strong interpersonal and presentation skills
SKILLS:
1. Advanced skills in designing reporting interfaces and interactive dashboards in Google Sheets and Excel
2. Experience working with senior decision-makers
3. Strong advanced SQL/MySQL and Python skills with the ability to fetch data from the Data Warehouse as per the stakeholder's requirement
4. Good Knowledge and experience in Excel VBA and advanced excel
5. Good Experience in building Tableau analytical Dashboards as per the stake holder's reporting requirements
6. Strong communication/interpersonal skills
PERSONA:
1. Experience in working on ad hoc requirements
2. Ability to adapt to shifting priorities
3. Experience working in the fintech or e-commerce industry is preferable
4. Engineering background with 2+ years of experience as a Business Analyst for finance processes
About UpSolve
Work on cutting-edge tech stack. Build innovative solutions. Computer Vision, NLP, Video Analytics and IOT.
Job Role
- Ideate use cases to include recent tech releases.
- Discuss business plans and assist teams in aligning with dynamic KPIs.
- Design solution architecture end to end, from inputs through the infrastructure and services used to the data store.
Job Requirements
- Working knowledge about Azure Cognitive Services.
- Project Experience in building AI solutions like Chatbots, sentiment analysis, Image Classification, etc.
- Quick Learner and Problem Solver.
Job Qualifications
- Work Experience: 2 years +
- Education: Computer Science/IT Engineer
- Location: Mumbai
Minimum of 8 years of experience, of which 4 years should be applied data mining experience in disciplines such as call centre metrics.
Strong experience in advanced statistics and analytics including segmentation, modelling, regression, forecasting etc.
Experience with leading and managing large teams.
Demonstrated pattern of success in using advanced quantitative analytic methods to solve business problems.
Demonstrated experience with Business Intelligence/Data Mining tools to work with
data, investigate anomalies, construct data sets, and build models.
It is critical to share details of projects undertaken (preferably in the telecom industry), specifically analyses based on CRM data.
Roles and Responsibilities:
- Strong experience with programming microcontrollers like Arduino, ESP32, and ESP8266.
- Experience with Embedded C/C++.
- Experience with Raspberry Pi, Python, and OpenCV (see the brief sketch after this list).
- Experience with low-power devices would be preferred
- Knowledge of communication protocols (UART, I2C, etc.)
- Experience with Wi-Fi, LoRa, GSM, M2M, SIMCom, and Quectel modules.
- Experience with 3D modeling (preferred).
- Experience with 3D printers (preferred).
- Experience with hardware design and knowledge of basic electronics.
- Experience with software will be preferred.
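As a rough illustration of the Raspberry Pi / OpenCV work listed above, here is a minimal sketch; the camera index, motion threshold, and output paths are assumptions:

```python
# Illustrative sketch only; camera index, threshold, and output paths are assumptions.
import cv2

def capture_motion_frames(camera_index: int = 0, threshold: float = 25.0) -> None:
    """Grab frames from a camera and save those that differ noticeably from the previous one."""
    cap = cv2.VideoCapture(camera_index)
    ok, prev = cap.read()
    if not ok:
        raise RuntimeError("Could not read from camera")
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

    saved = 0
    while saved < 5:  # stop after a few frames for the sketch
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        diff = cv2.absdiff(gray, prev_gray)
        if diff.mean() > threshold:          # crude motion detection
            cv2.imwrite(f"motion_{saved}.jpg", frame)
            saved += 1
        prev_gray = gray

    cap.release()

if __name__ == "__main__":
    capture_motion_frames()
```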
Detailed job role (daily tasks) performed by the IoT developer:
· Design hardware that meets the needs of the application.
· Support for current hardware, testing, and bug fixing.
· Create, maintain, and document microcontroller code.
· Prototyping, testing, and soldering.
· Making 3D/CAD models for PCBs.
Bookeventz is looking for a rock star coder to work closely with the founders to build on our core platform and mobile applications: a techie with an expert understanding of server logic, REST APIs, and core web technologies.
Should have a great understanding of MEAN/MERN Stack.
Our Tech Stack:
PHP, MySql, CodeIgniter, React.js, Node.js, ExpressJS, React Native, Javascript, HTML5, CSS3, REST API, AWS Server Management (RDS, Lambda, API Gateway), Ubuntu, Git, CI/CD Good To have: Python, Payment Gateway.
Few Challenges we are working on right now:
- Working on various optimization to improve page speed.
- Improving Server Response & Load balancing.
- Server-side development on various projects.
- A customized CRM to double the number of leads handled per salesperson per day.
- Create a platform for users to create their custom event website.
Job Objectives:
- Integration of user-facing elements developed by front-end developers with server-side logic
- Optimize web applications to maximize speed and scale. Support diverse clients from high-powered desktop computers to small mobile devices
- Optimization of the application for maximum speed and scalability
- Implementation of security and data protection
- Collaborate with senior management, operations & business team
- Building REST APIs and maintaining database optimizations
Looking For:
- Great understanding of Node, React, Express, Socket.io, and MVVM frameworks
- Has worked on e-commerce websites, server handling, and deployment scripts
- Has worked on a customer-facing product for 2 years
- Strong knowledge of MEAN stack (Min 3 yrs professional working experience) with basic understanding of LAMP.
- Experience designing scalable, fault tolerant systems and databases.
at NeoSoft Technologies (A CMMi Level 5 Organization)
- Minimum 2.5 years of experience as a Python Developer.
- Minimum 2.5 years of experience in any framework like Django/Flask/Fast API
- Minimum 2.5 years of experience in SQL/PostgreSQL
- Minimum 2.5 years of experience in Git/GitLab/Bitbucket
- Minimum 2+ years of experience in deployment (CI/CD with Jenkins)
- Minimum 2.5 years of experience in any cloud like AWS/GCP/Azure
Lifespark is looking for individuals with a passion for impacting real lives through technology. Lifespark is one of the most promising startups in the Assistive Tech space in India, and has been honoured with several National and International awards. Our mission is to create seamless, persistent and affordable healthcare solutions. If you are someone who is driven to make a real impact in this world, we are your people.
Lifespark is currently building solutions for Parkinson’s Disease, and we are looking for an ML lead to join our growing team. You will be working directly with the founders on high-impact problems in the Neurology domain. You will be solving some of the most fundamental and exciting challenges in the industry and will have the ability to see your insights turned into real products every day.
Essential experience and requirements:
1. Advanced knowledge in the domains of computer vision, deep learning
2. Solid understanding of statistical/computational concepts such as hypothesis testing, statistical inference, design of experiments, and production-level ML system design
3. Experienced with proper project workflow
4. Good at collating multiple datasets (potentially from different sources)
5. Good understanding of setting up production-level data pipelines
6. Ability to independently develop and deploy ML systems to various platforms (local and cloud)
7. Fundamentally strong with time-series data analysis, cleaning, featurization, and visualisation (see the brief sketch after this list)
8. Fundamental understanding of model and system explainability
9. Proactive at constantly unlearning and relearning
10. Documentation ninja: can understand others' documentation as well as create good documentation
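For illustration, a minimal sketch of the time-series featurization mentioned above, on a synthetic sensor signal; the column names and window sizes are arbitrary assumptions:

```python
# Illustrative sketch only; the signal is synthetic and window sizes are arbitrary.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
idx = pd.date_range("2024-01-01", periods=500, freq="s")

# Hypothetical wearable-sensor signal: a slow drift plus noise.
signal = np.cumsum(rng.normal(0, 0.05, len(idx))) + rng.normal(0, 0.2, len(idx))
df = pd.DataFrame({"accel": signal}, index=idx)

# Lag and rolling-window features commonly derived before modelling.
df["accel_lag_1"] = df["accel"].shift(1)
df["accel_roll_mean_10s"] = df["accel"].rolling("10s").mean()
df["accel_roll_std_10s"] = df["accel"].rolling("10s").std()
df["accel_diff"] = df["accel"].diff()

print(df.dropna().head())
```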
Responsibilities :
1. Develop and deploy ML based systems built upon healthcare data in the Neurological domain
2. Maintain deployed systems and upgrade them through online learning
3. Develop and deploy advanced online data pipelines
About SafeGold
Gold is the most trusted asset across the entire world and one of the largest asset classes in India. The total traded value of gold in India exceeds $300 billion annually – nearly all of it in an unorganised, offline manner. We, at SafeGold, are building the digital infrastructure to organise the gold market using technology. SafeGold is India’s largest digital gold platform with 35 million customers and 100+ distribution partners across India, Thailand and UAE. The focus has always been on revenues and profitability growth which has been widely recognised.
#9 in 2023 ET Growth Champions (India)
#21 in 2023 FT High Growth Companies (Asia)
With revenues of more than Rs. 4,500 crs in the year ending March-23, we have been part of the Financial Times rankings of the fastest growing startups in the Asia-Pacific region since 2021 till date.
SafeGold is backed by the World Gold Council and leading venture capital funds, Beenext and Pravega.
Job Description
We’re a small team with insanely large ambitions. We are looking for a Software Engineer who will help shape SafeGold’s product and marketing strategies by processing, analyzing, and interpreting data sets. You will work closely with the tech, product, and business teams to design and develop a scalable application architecture based on business requirements. The ideal candidate will have proficiency in PHP and relational as well as NoSQL databases (MySQL, MongoDB). You will have strong problem-solving skills and the independent self-direction to thrive in our work environment.
What will your job entail:
● Implement a robust set of services and APIs to power the web application.
● Build reusable code and libraries for future use.
● Optimize the application for maximum speed and scalability.
● Implement security and data protection as per industry standards.
● Monitor the front-end and back-end aspects of the web application
● Coordinate with the product team to understand the requirements and formulate a detailed technical documentation.
● Assist junior developers if required.
Key Requirements:
● 4-7 years of experience in software development; at least 1 year of experience in a product-based startup is preferable
● Strong understanding of MVC architecture
● At least 2-3 years of working experience in PHP is a must, and proficiency in the Laravel Framework will be a plus
● Proficient knowledge of both relational and NoSQL databases (MySQL, MongoDB) is a must
● Working knowledge of Python and Django/Flask Framework is good to have
● Familiarity with advanced JavaScript libraries and frameworks such as AngularJS, ReactJS
● Strong understanding of fundamental design principles to build a scalable application.
● Experience with implementing automated testing platforms and unit tests.
● Proficient understanding of code versioning tools (Git).
● At least 2 years of experience working on the AWS cloud platform, using tools like EC2, RDS, Lambda, SQS, ElastiCache
About UpSolve
We build and deliver complex AI solutions which help drive business decisions faster and more accurately. We are a typical AI company and have a range of solutions developed on video, image, and text.
What you will do
- Stay informed on new technologies and implement cautiously
- Maintain necessary documentation for the project
- Fix the issues reported by application users
- Plan, build, and design solutions with a mental note of future requirements
- Coordinate with the development team to manage fixes, code changes, and merging
Location: Mumbai
Working Mode: Remote
What are we looking for
- Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
- Minimum 2 years of professional experience in software development, with a focus on machine learning and full stack development.
- Strong proficiency in Python programming language and its machine learning libraries such as TensorFlow, PyTorch, or scikit-learn.
- Experience in developing and deploying machine learning models in production environments.
- Proficiency in web development technologies including HTML, CSS, JavaScript, and front-end frameworks such as React, Angular, or Vue.js.
- Experience in designing and developing RESTful APIs and backend services using frameworks like Flask or Django.
- Knowledge of databases and SQL for data storage and retrieval.
- Familiarity with version control systems such as Git.
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration abilities.
- Ability to work effectively in a fast-paced and dynamic team environment.
- Good to have Cloud Exposure
Design, implement, and improve the analytics platform
Implement and simplify self-service data query and analysis capabilities of the BI platform
Develop and improve the current BI architecture, emphasizing data security, data quality
and timeliness, scalability, and extensibility
Deploy and use various big data technologies and run pilots to design low latency
data architectures at scale
Collaborate with business analysts, data scientists, product managers, software development engineers,
and other BI teams to develop, implement, and validate KPIs, statistical analyses, data profiling, prediction,
forecasting, clustering, and machine learning algorithms
Educational
At Ganit we are building an elite team, ergo we are seeking candidates who possess the
following backgrounds:
7+ years relevant experience
Expert level skills writing and optimizing complex SQL
Knowledge of data warehousing concepts
Experience in data mining, profiling, and analysis
Experience with complex data modelling, ETL design, and using large databases
in a business environment
Proficiency with Linux command line and systems administration
Experience with languages like Python/Java/Scala
Experience with Big Data technologies such as Hive/Spark
Proven ability to develop unconventional solutions; sees opportunities to innovate and leads the way
Good experience working on cloud platforms like AWS, GCP & Azure, having worked on projects involving the creation of a data lake or data warehouse
Excellent verbal and written communication.
Proven interpersonal skills and ability to convey key insights from complex analyses in
summarized business terms. Ability to effectively communicate with multiple teams
Good to have
AWS/GCP/Azure Data Engineer Certification
Role : Principal Devops Engineer
About the Client
It is a product-based company building a platform using AI and ML technology for transportation and logistics. They also have a presence in the global market.
Responsibilities and Requirements
• Experience in designing and maintaining high volume and scalable micro-services architecture on cloud infrastructure
• Knowledge in Linux/Unix Administration and Python/Shell Scripting
• Experience working with cloud platforms like AWS (EC2, ELB, S3, Auto-scaling, VPC, Lambda), GCP, Azure
• Knowledge in deployment automation, Continuous Integration and Continuous Deployment (Jenkins, Maven, Puppet, Chef, GitLab) and monitoring tools like Zabbix, Cloud Watch Monitoring, Nagios
• Knowledge of Java Virtual Machines, Apache Tomcat, Nginx, Apache Kafka, Microservices architecture, Caching mechanisms
• Experience in enterprise application development, maintenance and operations
• Knowledge of best practices and IT operations in an always-up, always-available service
• Excellent written and oral communication skills, judgment, and decision-making skills