Cutshort logo

50+ Python Jobs in India

Apply to 50+ Python Jobs on CutShort.io. Find your next job, effortlessly. Browse Python Jobs and apply today!

icon
CallHub

at CallHub

4 candid answers
1 video
Eman Khan
Posted by Eman Khan
Bengaluru (Bangalore)
1 - 3 yrs
Upto ₹20L / yr (Varies
)
skill iconPython
skill iconAmazon Web Services (AWS)
Nginx
skill iconRedis
Memcached
+7 more

About us

CallHub provides cloud based communication software for nonprofits, political parties, advocacy organizations and businesses. We have delivered over 200 millions messages and calls for thousands of customers. We help political candidates during their campaigns in getting their message across to their voters, conduct surveys, manage event/town-hall invites and with recruiting volunteers for election campaigns. We are profitable with 8000+ paying customers from North America, Australia and Europe. Our customers include Uber, the Democratic Party, major political parties in the US, Canada, UK, France and Australia. 


About the Role

As a DevOps Engineer, you will play a crucial role in architecting, implementing, and managing the infrastructure and automation that powers our scalable and reliable cloud-based applications. Your expertise in Python, AWS, and various cloud services will be pivotal in driving our CI/CD pipelines, ensuring optimal performance, security, and availability of our services. Within the DevOps team, you will work closely with highly technical software engineers, quality engineers, support engineers, and product managers to build and maintain robust infrastructure solutions that support our rapidly growing customer base. We're looking for engineers with strong computer science fundamentals who are passionate, inquisitive and eager to learn new technologies and love working in a dynamic and fast-paced environment, contributing to our mission of delivering exceptional product experiences. Your


Responsibilities

  • Architect, implement, and manage scalable and secure infrastructure on AWS using services such as RDS, EC2, ELB, ASG, CloudWatch, and Lambda.
  • Automate deployment pipelines (CI/CD) to ensure seamless and reliable delivery of software to production.
  • Implement and maintain monitoring and alerting systems using CloudWatch and other tools to ensure system reliability and performance.
  • Optimize the performance and security of our applications using Cloudflare, Nginx, Redis, and CDNs.
  • Manage and optimize databases, including PostgreSQL, with a focus on indexing, query tuning, and performance optimization.
  • Act as a Site Reliability Engineer (SRE), being part of the on-call rotation to respond to and resolve critical incidents, ensuring high availability and minimal downtime.
  • Develop and implement strategies for incident management, root cause analysis, and post-incident reviews to continuously improve system reliability.
  • Collaborate with development teams to integrate DevOps best practices into the lifecycle of applications built with Python, Django, and Celery.
  • Deploy and manage message brokers and streaming platforms like RabbitMQ and Kafka.
  • Configure and manage proxy servers, reverse proxies, and load balancers to ensure optimal traffic management and security.
  • Troubleshoot and resolve infrastructure-related issues promptly.
  • Document processes, configurations, and best practices to ensure knowledge sharing and smooth operation.
  • Contribute to the continuous improvement of our DevOps practices and toolsets.
  • Communicate well with product and relevant stakeholders.


What we’re looking for

  • 1-3 years of experience in a DevOps or similar role, with a strong focus on Python scripting, AWS services, and infrastructure management experience.
  • Hands-on experience with AWS services such as EC2, RDS, ELB, ASG, CloudWatch, and Lambda.
  • Strong knowledge of CI/CD tools and practices, including automation using Jenkins, GitLab CI, or similar tools.
  • Experience with infrastructure as code (IaC) tools like Terraform, Pulumi, or CloudFormation.
  • Experience with Python-based frameworks like Django and task queues like Celery.
  • Proficiency in managing web servers (Nginx), caching solutions (Redis, Memcache), and CDNs.
  • Experience with relational databases (PostgreSQL) and messaging systems like RabbitMQ and Kafka.
  • Knowledge of database indexing, query tuning, and performance optimization techniques.
  • Solid understanding of proxy servers, reverse proxies, and load balancers.
  • Ability to troubleshoot complex issues across multiple layers of the stack.
  • Strong communication skills, with the ability to work effectively in a collaborative team environment.
  • Passionate about learning new technologies and improving existing processes.
  • BE/MS in Computer Science or a related field, or equivalent practical experience.


What you can look forward to

  • The opportunity to work on cutting-edge cloud technologies and contribute to mission-critical infrastructure.
  • A role that allows you to take ownership of significant aspects of our infrastructure and automation.
  • A collaborative and open culture where your ideas are valued, and you are encouraged to take initiative and aspire to be great in your role.
  • A dynamic work environment where your contributions directly impact the success and reliability of our services. You will get to see your work directly impacting in a significant way.
  • Exposure to the full lifecycle of software development and deployment, from design to monitoring and optimization.
Read more
NirwanaAI
HR Manager
Posted by HR Manager
Delhi, Gurugram, Noida, Ghaziabad, Faridabad
3 - 15 yrs
₹12L - ₹50L / yr
skill iconJava
skill iconC++
skill iconC#
Online machine learning
skill iconPython



Job Overview:

We are seeking a motivated and enthusiastic Junior AI/ML Engineer to join our dynamic team. The ideal candidate will have a foundational knowledge in machine learning, deep learning, and related technologies, with hands-on experience in developing ML models from scratch. You will work closely with senior engineers and data scientists to design, implement, and optimize AI solutions that drive innovation and improve our products and services.


Key Responsibilities:

  • Develop and implement machine learning and deep learning models from scratch for various applications.
  • Collaborate with cross-functional teams to understand requirements and provide AI-driven solutions.
  • Utilize deep learning frameworks such as TensorFlow, PyTorch, Keras, and JAX for model development and experimentation.
  • Employ data manipulation and analysis tools such as pandas, scikit-learn, and statsmodels to preprocess and analyze data.
  • Apply visualization tools like matplotlib and spacy to present data insights and model performance.
  • Demonstrate a general understanding of data structures, algorithms, multi-threaded programming, and distributed computing concepts.
  • Leverage knowledge of statistical and algorithmic models along with fundamental mathematical concepts, including linear algebra and probability.


Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Data Science, Statistics, or a related field.
  • Solid foundation in machine learning, deep learning, computer vision, and natural language processing (NLP).
  • Proven experience in developing ML/deep learning models from scratch.
  • Proficiency in Python and relevant libraries.
  • Hands-on experience with deep learning frameworks such as TensorFlow, PyTorch, Keras, or JAX.
  • Experience with data manipulation and analysis libraries like pandas, scikit-learn, and visualization tools like matplotlib.
  • Strong understanding of data structures, algorithms, and multi-threaded programming.
  • Knowledge of statistical models and fundamental mathematical concepts, including linear algebra and probability.


Skills and Competencies:

  • Excellent problem-solving skills and attention to detail.
  • Strong communication and collaboration abilities.
  • Ability to work independently and as part of a team in a fast-paced environment.
  • Eagerness to learn and stay updated with the latest advancements in AI/ML technologies.


Preferred Qualifications:

  • Previous internship or project experience in AI/ML.
  • Familiarity with cloud-based AI/ML services and tools.


Nirwana.AI is an equal opportunity employer and welcomes applicants from all backgrounds to apply.


Read more
Datacultr Fintech
Shweta Jha
Posted by Shweta Jha
Gurugram
1 - 5 yrs
₹5L - ₹9L / yr
skill iconVue.js
skill iconAngularJS (1.x)
skill iconAngular (2+)
skill iconReact.js
skill iconJavascript
+6 more

SOFTWARE DEVELOPER

 

ABOUT US

 

Datacultr is a global Digital Operating System for Risk Management and Credit Recovery, we drive Collection Efficiencies, and Reduces Delinquencies and Non-Performing Loans (NPLs). Datacultr is a Digital-Only provider of Consumer Engagement, Recovery, and Collection Solutions, helping Consumer Lending, Retail, Telecom, and Fintech Organizations to expand and grow their business in the under-penetrated New to Credit and Thin File Segments. Datacultr’s platforms, make the underserved and unbanked segment viable, for providers of Consumer Durable Loans, Buy Now Pay Later, Micro-Loans, Nano-Loans, and other Unsecured Loans.

 

We are helping millions of new to-credit consumers, across emerging markets, access formal credit and begin their journey towards financial health. We have clients across India, South Asia, South East Asia, Africa and LATAM.

 

Datacultr is headquartered in Dubai, with offices in Abu Dhabi, Singapore, Ho Chi Minh City, Nairobi, and Mexico City; with our Development Center based out of Gurugram, India.

 

 

ORGANIZATION’S GROWTH PLAN

 

Datacultr’s vision is to enable convenient financing opportunities for consumers, entrepreneurs, and small merchants, helping them combat the Socio-economic problems this segment faces due to restricted access to financing.

 

We are on a mission to enable 30 million unbanked & under-served people, to access financial services by 2025.

 


JOB DESCRIPTION

 

POSITION                                   –         Software Developer – L1/L2

ROLE                                           –         Individual Contributor

FUNCTION                                 –         Engineering

WORK LOCATION                    –         Gurugram

WORK MODEL                          –         Work from the Office only

QUALIFICATION                       –         B.Tech /M.Tech /B.C.A. /M.C.A.

SALARY PACKAGE                 –         Negotiable based on skillset & experience

NOTICE PERIOD                       –         Can join at the earliest

 

 

EXPECTATION

 

We are seeking a highly skilled and experienced Software Engineer with a minimum of 2 years of professional experience in Python and Django, specifically in building REST APIs using frameworks like FASTAPI and Django Rest Framework (DRF). The ideal candidate should have hands-on experience with Redis cache, Docker, containerization tools, and PostgreSQL.

 

 

KEY RESPONSIBILITIES

 

1.   Collaborate with cross-functional teams to design, develop, and maintain high-quality software solutions using Python, Django (including Django REST Framework), FastAPI, and other relevant frameworks.

2.   Build robust and scalable REST APIs, ensuring efficient data transfer and seamless integration with frontend and third-party systems.

3.   Utilize Redis for caching, session management, and performance optimization, and implement other caching strategies as needed.

4.   Containerize applications using Docker for easy deployment and scalability.

5.   Design and implement database schemas using PostgreSQL, ensuring data integrity and performance.

6.   Write clean, efficient, and well-documented code following best practices and coding standards.

7.   Participate in system design discussions and contribute to architectural decisions.

8.   Troubleshoot and debug complex software issues, ensuring smooth operation of the application.

9.   Profile and optimize Python code for improved performance and scalability.

10. Implement and maintain CI/CD pipelines for automated testing and deployment.



KEY REQUIREMENTS

 

·        2+ years of experience in Python backend development.

·        Strong proficiency in Python, Django, and RESTful API development.

·        Experience with FastAPI, asyncio, and other modern Python libraries and frameworks.

·        Solid understanding of database technologies, particularly PostgreSQL.

·        Proficiency in using Redis for caching and performance optimization.

·        Experience with Docker containerization and orchestration.

·        Knowledge of cloud platforms (AWS) and experience with related services (e.g., EC2, S3, RDS).

·        Familiarity with message brokers like RabbitMQ or Kafka.

·        Experience with Test-Driven Development (TDD) and automated testing frameworks.

·        Proficiency in version control systems, particularly Git.

·        Strong problem-solving skills and attention to detail.

·        Excellent communication skills and ability to work effectively in a collaborative environment.

·        Experience with Agile development methodologies.


PERKS & BENEFITS

 

v Professional Development through Learning & Up-skilling.

v Flexible working hours

v Medical benefits

v Exciting work culture

 


 

Read more
Wissen Technology

at Wissen Technology

4 recruiters
Sukanya Mohan
Posted by Sukanya Mohan
Bengaluru (Bangalore), Mumbai
7 - 12 yrs
Best in industry
skill iconPython
skill iconDjango
skill iconFlask

Job Description: Python Backend Developer 

 

Experience: 7-12 years   

Job Type: Full-time   

 

Job Overview: 

Wissen Technology is looking for a highly experienced Python Backend Developer with 7-12 years of experience to join our team. The ideal candidate will have deep expertise in backend development using Python, with a strong focus on Django and Flask frameworks. 

Key Responsibilities: 

- Develop and maintain robust backend services and APIs using Python, Django, and Flask. 

- Design scalable and efficient database schemas, integrating with both relational and NoSQL databases. 

- Collaborate with front-end developers and other team members to establish objectives and design functional, cohesive code. 

- Optimize applications for maximum speed and scalability. 

- Ensure security and data protection protocols are implemented effectively. 

- Troubleshoot and debug applications to ensure a seamless user experience. 

- Participate in code reviews, testing, and quality assurance processes. 

  

Required Skills: 

Python: Extensive experience in backend development using Python. 

Django & Flask:  Proficiency in Django and Flask frameworks. 

Database Management: Strong knowledge of databases such as PostgreSQL, MySQL, and MongoDB. 

API Development: Expertise in building and maintaining RESTful APIs. 

Security: Understanding of security best practices and data protection measures. 

Version Control: Experience with Git for collaboration and version control. 

Problem-Solving: Strong analytical skills with a focus on writing clean, efficient code. 

Communication: Excellent communication and teamwork skills. 

  

Preferred Qualifications: 

- Experience with cloud services like AWS, Azure, or GCP. 

- Familiarity with Docker and containerization. 

- Knowledge of CI/CD practices. 

  

Why Join Wissen Technology? 

- Opportunity to work on innovative projects with a cutting-edge technology stack. 

- Competitive compensation and benefits package. 

- A supportive environment that fosters professional growth and learning. 

 

Read more
Planet Spark

at Planet Spark

5 recruiters
Mahwish  Khanam
Posted by Mahwish Khanam
Gurugram
1 - 3 yrs
Best in industry
SQL
Advanced analytics
MS-Excel
skill iconPython
PowerBI

PlanetSpark is Hiring !!


Title of the Job : Data Analyst ( FULL TIME)

Location : Gurgaon


Roles and Responsibilities/Mission Statement:


We are seeking an experienced Data Analyst to join our dynamic team. The ideal candidate will possess a strong analytical mindset, excellent problem-solving skills, and a passion for uncovering actionable insights from data. As a Data Analyst, you will be responsible for collecting, processing, and analyzing large datasets to help inform business decisions and strategies and will be the source of company wide intelligence


The responsibilities would include :


1) Creating a robust Sales MIS

2) Tracking key metrics of the company

3) Reporting key metrics on a daily basis

4) Sales incentive and teacher payout calculation

5) Tracking and analyzing large volume of consumer data related to customers and teachers

6) Developing intelligence from data from various sources


Ideal Candidate Profile -

- 1-4 years of experience in a data-intensive position at a consumer business or a Big 4 firm

- Excellent ability in advanced excel

- Knowledge of other data analytics tools and software such as SQL, Python, R, Excel, and data visualization tools will be good to have.

- Detail-oriented with strong organizational skills and the ability to manage multiple projects simultaneously.

- Exceptional analytical ability

- Detail-oriented with strong organizational skills and the ability to manage multiple projects simultaneously.


Eligibility Criteria:


- Willing to work 5 Days a week from office and Saturday - work from home

- Willing to work in an early-stage startup .

- Must have 1-3 years of prior experience in a data focused role at a consumer internet or a Big 4

- Must have excellent analytical abilities

- Available to Relocate to Gurgaon

- Candidate Should his own laptop

- Gurgaon based candidate will be given more preference


Join us and leverage your analytical expertise to drive data-driven decisions and contribute to our success. Apply today!



Read more
TVARIT GmbH
Shivani Kawade
Posted by Shivani Kawade
Pune
4 - 6 yrs
₹15L - ₹25L / yr
PyTorch
skill iconPython
Scikit-Learn
NumPy
pandas
+2 more

Who are we looking for?  


We are looking for a Senior Data Scientist, who will design and develop data-driven solutions using state-of-the-art methods. You should be someone with strong and proven experience in working on data-driven solutions. If you feel you’re enthusiastic about transforming business requirements into insightful data-driven solutions, you are welcome to join our fast-growing team to unlock your best potential.  

 

Job Summary 

  • Supporting company mission by understanding complex business problems through data-driven solutions. 
  • Designing and developing machine learning pipelines in Python and deploying them in AWS/GCP, ... 
  • Developing end-to-end ML production-ready solutions and visualizations. 
  • Analyse large sets of time-series industrial data from various sources, such as production systems, sensors, and databases to draw actionable insights and present them via custom dashboards. 
  • Communicating complex technical concepts and findings to non-technical stakeholders of the projects 
  • Implementing the prototypes using suitable statistical tools and artificial intelligence algorithms. 
  • Preparing high-quality research papers and participating in conferences to present and report experimental results and research findings. 
  • Carrying out research collaborating with internal and external teams and facilitating review of ML systems for innovative ideas to prototype new models. 

 

Qualification and experience 

  • B.Tech/Masters/Ph.D. in computer science, electrical engineering, mathematics, data science, and related fields. 
  • 5+ years of professional experience in the field of machine learning, and data science. 
  • Experience with large-scale Time-series data-based production code development is a plus. 

 

Skills and competencies 

  • Familiarity with Docker, and ML Libraries like PyTorch, sklearn, pandas, SQL, and Git is a must. 
  • Ability to work on multiple projects. Must have strong design and implementation skills. 
  • Ability to conduct research based on complex business problems. 
  • Strong presentation skills and the ability to collaborate in a multi-disciplinary team. 
  • Must have programming experience in Python. 
  • Excellent English communication skills, both written and verbal. 


Benefits and Perks

  • Culture of innovation, creativity, learning, and even failure, we believe in bringing out the best in you. 
  • Progressive leave policy for effective work-life balance. 
  • Get mentored by highly qualified internal resource groups and opportunity to avail industry-driven mentorship program, as we believe in empowering people.  
  • Multicultural peer groups and supportive workplace policies.  
  • Work from beaches, hills, mountains, and many more with the yearly workcation program; we believe in mixing elements of vacation and work. 


 Hiring Process 

  • Call with Talent Acquisition Team: After application screening, a first-level screening with the talent acquisition team to understand the candidate's goals and alignment with the job requirements. 
  • First Round: Technical round 1 to gauge your domain knowledge and functional expertise. 
  • Second Round: In-depth technical round and discussion about the departmental goals, your role, and expectations.
  • Final HR Round: Culture fit round and compensation discussions.
  • Offer: Congratulations you made it!  


If this position sparked your interest, apply now to initiate the screening process.

Read more
Nyteco

at Nyteco

2 candid answers
1 video
Alokha Raj
Posted by Alokha Raj
Remote only
2 - 3 yrs
₹4L - ₹6L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+7 more

About Davis Index


Davis Index is a market intelligence platform and publication that provides price benchmarks for recycled materials and primary metals.

Our team of dedicated reporters, analysts, and data specialists publish and process over 1,400 proprietary price indexes, metals futures prices, and other reference data including market intelligence, news, and analysis through an industry-leading technology platform.


About the role 

Here at Davis Index, we look to bring true, accurate market insights, news and data to the recycling industry. This enables sellers and buyers to boost their margins, and access daily market intelligence, data analytics, and news.


We’re looking for a keen data expert who will take on a high-impact role that focuses on end-to-end data management, BI and analysis tasks within a specific functional area or data type. If taking on challenges in building, extracting, refining and very importantly automating data processes is something you enjoy doing, apply to us now!


Key: Data integration, data migration, data warehouse automation, data synchronization, automated data extraction, or other data management projects.


What you will do in this role

  • Build and maintain data pipelines from internal databases.
  • Data mapping of data elements between source and target systems.
  • Create data documentation including mappings and quality thresholds.
  • Build and maintain analytical SQL/MongoDB queries, scripts.
  • Build and maintain Python scripts for data analysis/cleaning/structuring.
  • Build and maintain visualizations; delivering voluminous information in comprehensible forms or in ways that make it simple to recognise patterns, trends, and correlations.
  • Identify and develop data quality initiatives and opportunities for automation.
  • Investigate, track, and report data issues.
  • Undertake production data management functions as assigned/required.
  • Utilize various data workflow management and analysis tools.
  • Ability and desire to learn new processes, tools, and technologies.
  • Understanding fundamental AI and ML concepts.


Must have experience and qualifications

  • Bachelor's degree in Computer Science, Engineering, or Data related field required.
  • 2+ years’ experience in data management.
  • Advanced proficiency with Microsoft Excel and VBA/ Google sheets and AppScript
  • Proficiency with MongoDB/SQL.
  • Familiarity with Python for data manipulation and process automation preferred.
  • Proficiency with various data types and formats including, but not limited to JSON.
  • Intermediate proficiency with HTML/CSS.
  • Strong background in data analysis, data reporting, and data management coupled with the adept process mapping and improvements.
  • Strong research skills.
  • Attention to detail.


What you can expect

Work closely with a global team helping bring market intelligence to the recycling world. As a part of the Davis Index team we look to foster relationships and help you grow with us. You can also expect:

  • Work with leading minds from the recycling industry and be part of a growing, energetic global team
  • Exposure to developments and tools within your field ensures evolution in your career and skill building along with competitive compensation.
  • Health insurance coverage, paid vacation days and flexible work hours helping you maintain a work-life balance
  • Have the opportunity to network and collaborate in a diverse community



Read more
Hoomanely

at Hoomanely

5 candid answers
2 recruiters
Nikita Sinha
Posted by Nikita Sinha
Bengaluru (Bangalore)
6yrs+
Upto ₹50L / yr (Varies
)
skill iconPython
TensorFlow
PyTorch
skill iconMachine Learning (ML)

The ideal candidate will have a strong background in applying machine learning to Digital Signal Processing, particularly in audio signal analysis and accelerometer data analysis. This role involves developing innovative ML models, including CNNs, RNNs, and GANs, and leveraging techniques such as LLMs and low-rank adaptation to enhance our product offerings. The successful candidate will also be proficient in MLOps practices and deploying ML models into production environments. 

 

Key Responsibilities:

  • Custom ML Model Development: Design and build custom machine learning models from scratch, tailored to specific applications in digital and audio signal processing and accelerometer data analysis. The ideal candidate will have published papers or demonstrable customized models custom-built for specific problem domains.
  • Advanced ML Techniques: Apply advanced machine learning techniques, including time series analysis, CNNs, RNNs, GANs, LLMs, and low-rank adaptation, to solve complex problems in pet wellness technology.
  • Data Analysis and Processing: Perform sophisticated data analysis and preprocessing to prepare datasets for machine learning applications.
  • MLOps and Model Deployment: Implement MLOps practices to streamline the deployment of machine learning models into production, ensuring scalability, performance, and reliability.
  • Performance Optimization: Continuously monitor and optimize ML models to improve accuracy and efficiency.
  • Cross-functional Collaboration: Work closely with product development, engineering, and data science teams to integrate ML models into Hoomanely Inc.’s product ecosystem.
  • Research and Innovation: Stay abreast of the latest developments in machine learning and signal processing to drive innovation within the company.

 

Qualifications:

  • Bachelors/Masters in Engineering in Computer Science, Data Science, Electrical Engineering, or a related field focusing on machine learning. Preferably from a top tier (Tier 1 in India/US - IIT, NIT equivalent) institute.
  • Proven experience building custom ML models for digital signal processing, audio signal analysis, and accelerometer data analysis, with a minimum of 6 years of relevant experience.
  • Strong knowledge of time series-based machine learning, CNNs, RNNs, GANs, LLMs, and low-rank adaptation techniques.
  • Experience with MLOps practices and deploying machine learning models in production environments.
  • Proficiency in machine learning frameworks (e.g., TensorFlow, PyTorch) and programming languages (e.g., Python).
  • Excellent analytical, problem-solving, and communication skills.

 

Preferred Skills:

  • Experience in pet wellness or related industries.
  • Familiarity with IoT device data processing and analysis.
  • Knowledge of cloud computing platforms and services for ML deployment. 

 

Read more
Hoomanely

at Hoomanely

5 candid answers
2 recruiters
Nikita Sinha
Posted by Nikita Sinha
Bengaluru (Bangalore)
6yrs+
Upto ₹45L / yr (Varies
)
skill iconPython
skill iconGo Programming (Golang)
skill iconNodeJS (Node.js)
TensorFlow
PyTorch

Backend Engineer with a strong foundation in Machine Learning, proficient in Node.js, Go, and Python, and experienced in developing web crawlers and using machine learning frameworks like TensorFlow and PyTorch. The candidate will play a pivotal role in enhancing backend systems, contributing to various projects, and optimizing our technology for better user experiences and product functionality. The candidate should have experience with leading a small team of high-performing talented engineers.

 

Key Responsibilities:

  • Backend System Development: Design and implement scalable backend services using Node.js, Go, and Python.
  • Machine Learning Integration: Apply machine learning algorithms and frameworks, including TensorFlow and PyTorch, to enhance data analytics and insights.
  • Web Crawling and Data Scraping: Develop and maintain web crawlers and scrapers using tools like Scrapy to gather and process data efficiently.
  • DevOps and CI/CD Pipelines: Build and manage CI/CD pipelines for streamlined deployment and continuous integration.
  • Infrastructure Management: Oversee cloud and on-premises infrastructure, focusing on scalability and system reliability.
  • Collaboration and Agile Development: Work within cross-functional teams, aligning backend development with broader project objectives and adopting agile methodologies.
  • Agile Project Management: Independently build the engineering backlog, plan sprints, track sprints, measure sprint velocity, and burndown, and ensure timely product delivery.
  • Testing and Quality Assurance: Ensure the stability and performance of backend systems through comprehensive testing and quality control.

 

Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or related field.
  • Solid experience in backend development using Node.js, Go, and Python. A minimum of 6 years of experience in a high-scale consumer internet startup.
  • Experience with machine learning frameworks such as TensorFlow or PyTorch.
  • Proficiency in developing web crawlers and scrapers, with experience using Scrapy.
  • Strong understanding of cloud infrastructure, DevOps practices, and CI/CD pipelines.
  • Familiarity with handling large-scale data processing and storage.

 

Preferred Skills:

  • Exposure to IoT technologies and data integration from various sources.
  • Effective problem-solving skills and meticulous attention to detail. 
Read more
TVARIT GmbH
Shivani Kawade
Posted by Shivani Kawade
Remote, Pune
2 - 4 yrs
₹8L - ₹20L / yr
skill iconPython
PySpark
ETL
databricks
Azure
+6 more

TVARIT GmbH develops and delivers solutions in the field of artificial intelligence (AI) for the Manufacturing, automotive, and process industries. With its software products, TVARIT makes it possible for its customers to make intelligent and well-founded decisions, e.g., in forward-looking Maintenance, increasing the OEE and predictive quality. We have renowned reference customers, competent technology, a good research team from renowned Universities, and the award of a renowned AI prize (e.g., EU Horizon 2020) which makes Tvarit one of the most innovative AI companies in Germany and Europe. 

 

 

We are looking for a self-motivated person with a positive "can-do" attitude and excellent oral and written communication skills in English. 

 

 

We are seeking a skilled and motivated Data Engineer from the manufacturing Industry with over two years of experience to join our team. As a data engineer, you will be responsible for designing, building, and maintaining the infrastructure required for the collection, storage, processing, and analysis of large and complex data sets. The ideal candidate will have a strong foundation in ETL pipelines and Python, with additional experience in Azure and Terraform being a plus. This role requires a proactive individual who can contribute to our data infrastructure and support our analytics and data science initiatives. 

 

 

Skills Required 

  • Experience in the manufacturing industry (metal industry is a plus)  
  • 2+ years of experience as a Data Engineer 
  • Experience in data cleaning & structuring and data manipulation 
  • ETL Pipelines: Proven experience in designing, building, and maintaining ETL pipelines. 
  • Python: Strong proficiency in Python programming for data manipulation, transformation, and automation. 
  • Experience in SQL and data structures  
  • Knowledge in big data technologies such as Spark, Flink, Hadoop, Apache and NoSQL databases. 
  • Knowledge of cloud technologies (at least one) such as AWS, Azure, and Google Cloud Platform. 
  • Proficient in data management and data governance  
  • Strong analytical and problem-solving skills. 
  • Excellent communication and teamwork abilities. 

 


Nice To Have 

  • Azure: Experience with Azure data services (e.g., Azure Data Factory, Azure Databricks, Azure SQL Database). 
  • Terraform: Knowledge of Terraform for infrastructure as code (IaC) to manage cloud. 


Read more
SmartWinnr
Hyderabad
4 - 5 yrs
Best in industry
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+3 more

MIS/ Data Analyst perform various functions, which primarily involve helping an organization meet its strategic goals through providing direction and guidance concerning information processes, security of data, and software application management.


However, the major tasks, duties, and responsibilities that commonly define the MIS/ Data Analyst job description are listed below:


  • Carry out analysis and assessment on the organization/clients existing software and database management systems, and information technology need to identify the shortcomings and proffer solutions to meet clients’ needs within time and cost constraints.
  • Responsible for the design and development of software and computer systems to meet the project needs within time and cost constraints.
  • Responsible for developing e-Commerce, database driven web sites, or other technology-driven solutions to improve the organization/client's processes using Windows and Java technologies.
  • Resolve end user hardware and software issues in a timely and professional manner.
  • Responsible for the development and maintenance of documentation and metrics to improve operational efficiency.
  • Carry out research and evaluate new technologies that can improve existing services.
  • Work together with other members of the MIS team to cross-train and provide backup to other team members.
  • Appraise and write software and database business systems analysis report on a framework, as well as update user knowledge for developed systems.
  • Perform system monitoring and analysis, and performance tuning to track system performance.
  • Responsible for troubleshooting system hardware, software, networks, and operating systems
  • Develop system access criteria, monitor, and control all system access; and implement security controls to secure computer system and ensure data integrity.
  • Provide IT support to the user community by interfacing with them.

Requirements:

  • Education: To work as an MIS analyst, applicants require a minimum of bachelor's degree in MIS, ISM, or Operations Research; Business Computer Science, Management Information Systems, or a field related to the employer’s industry.
  • Experience: Minimum 4 years of working experience in the same domain with advanced knowledge of MS Excel & tools like PowerBI, Tableau.
  • Knowledge: It is important that applicants have practical experience in business analytics or intelligence. They also require a working knowledge of a variety of data analysis techniques, including clustering, factor analysis, and logistic regression; data/text mining, decision trees, etc.
  • The MIS/ Data Analyst job requires that they undertake research to solve business problems, so it is vital that applicants can manipulate and extract insights from large and complex datasets and are proficient with SQL and tableau. It is also vital that they are comfortable with financial analysis to successfully perform some aspects of their job.
  • Computer skills: To succeed as an MIS/ Data Analyst, applicants must be advanced with Microsoft Tools, including Excel and possess advanced computer skills for data extraction, manipulation, and data analysis. Having practical experience with Power BI, Python etc.
  • Problem-solving skills: To perform their job successfully, the MIS/ Data Analyst must be curious individuals with analytical abilities to examine the current processes of an organization and come up with creative ways of solving them.
  • Communication skills: Their job requires that they communicate gaps to management or clients and train staff on how to use new or revised systems implemented. So, it is crucial that applicants are excellent communicators who can clarify expectations and drive alignment.
  • Collaborative skills: It is also important that applicants can work together with different teams across the organization with a commitment to deadlines and deliverables.


Read more
TVARIT GmbH
Shivani Kawade
Posted by Shivani Kawade
Remote, Pune
2 - 6 yrs
₹8L - ₹25L / yr
SQL Azure
databricks
skill iconPython
SQL
ETL
+9 more

TVARIT GmbH develops and delivers solutions in the field of artificial intelligence (AI) for the Manufacturing, automotive, and process industries. With its software products, TVARIT makes it possible for its customers to make intelligent and well-founded decisions, e.g., in forward-looking Maintenance, increasing the OEE and predictive quality. We have renowned reference customers, competent technology, a good research team from renowned Universities, and the award of a renowned AI prize (e.g., EU Horizon 2020) which makes TVARIT one of the most innovative AI companies in Germany and Europe.


We are looking for a self-motivated person with a positive "can-do" attitude and excellent oral and written communication skills in English.


We are seeking a skilled and motivated senior Data Engineer from the manufacturing Industry with over four years of experience to join our team. The Senior Data Engineer will oversee the department’s data infrastructure, including developing a data model, integrating large amounts of data from different systems, building & enhancing a data lake-house & subsequent analytics environment, and writing scripts to facilitate data analysis. The ideal candidate will have a strong foundation in ETL pipelines and Python, with additional experience in Azure and Terraform being a plus. This role requires a proactive individual who can contribute to our data infrastructure and support our analytics and data science initiatives.


Skills Required:


  • Experience in the manufacturing industry (metal industry is a plus)
  • 4+ years of experience as a Data Engineer
  • Experience in data cleaning & structuring and data manipulation
  • Architect and optimize complex data pipelines, leading the design and implementation of scalable data infrastructure, and ensuring data quality and reliability at scale
  • ETL Pipelines: Proven experience in designing, building, and maintaining ETL pipelines.
  • Python: Strong proficiency in Python programming for data manipulation, transformation, and automation.
  • Experience in SQL and data structures
  • Knowledge in big data technologies such as Spark, Flink, Hadoop, Apache, and NoSQL databases.
  • Knowledge of cloud technologies (at least one) such as AWS, Azure, and Google Cloud Platform.
  • Proficient in data management and data governance
  • Strong analytical experience & skills that can extract actionable insights from raw data to help improve the business.
  • Strong analytical and problem-solving skills.
  • Excellent communication and teamwork abilities.


Nice To Have:

  • Azure: Experience with Azure data services (e.g., Azure Data Factory, Azure Databricks, Azure SQL Database).
  • Terraform: Knowledge of Terraform for infrastructure as code (IaC) to manage cloud.
  • Bachelor’s degree in computer science, Information Technology, Engineering, or a related field from top-tier Indian Institutes of Information Technology (IIITs).
  • Benefits And Perks
  • A culture that fosters innovation, creativity, continuous learning, and resilience
  • Progressive leave policy promoting work-life balance
  • Mentorship opportunities with highly qualified internal resources and industry-driven programs
  • Multicultural peer groups and supportive workplace policies
  • Annual workcation program allowing you to work from various scenic locations
  • Experience the unique environment of a dynamic start-up


Why should you join TVARIT ?


Working at TVARIT, a deep-tech German IT startup, offers a unique blend of innovation, collaboration, and growth opportunities. We seek individuals eager to adapt and thrive in a rapidly evolving environment.


If this opportunity excites you and aligns with your career aspirations, we encourage you to apply today!

Read more
Mahindra

Mahindra

Agency job
via Pluginlive by Harsha Saggi
Mumbai
0 - 2 yrs
₹0 - ₹30000 / mo
Fullstack Developer
skill iconPython

Job Description:

We are seeking a highly motivated and talented Full Stack Intern to join our dynamic team. The ideal candidate will have a strong foundation in Python, data, cloud technologies, and Java, and a keen interest in learning and applying these skills to real-world projects.

Responsibilities:

  • Collaborate with the development team to understand project requirements and design specifications.
  • Develop and maintain web applications using Python frameworks (e.g., Django, Flask).
  • Work with data to extract insights and build data-driven applications.
  • Explore and utilize cloud platforms (e.g., AWS, GCP, Azure) for application deployment and scaling.
  • Develop and test Java-based components as needed.
  • Learn and adapt to new technologies and programming languages.
  • Contribute to the overall success of the project by providing innovative ideas and solutions.

Qualifications:

  • Pursuing a Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
  • Strong programming skills in Python and Java.
  • Understanding of data structures, algorithms, and database systems.
  • Exposure to cloud computing concepts and platforms (AWS, GCP, Azure preferred).
  • Experience with front-end technologies (HTML, CSS, JavaScript) is a plus.
  • Ability to learn quickly and adapt to new technologies.
  • Strong problem-solving and analytical skills.
  • Excellent communication and teamwork skills.

Preferred Skills:

  • Experience with data analysis and visualization tools (e.g., Pandas, NumPy, Matplotlib).
  • Knowledge of SQL and NoSQL databases.
  • Familiarity with Agile development methodologies.
  • Contributions to open-source projects or personal projects.
  • This internship offers a unique opportunity to gain hands-on experience in a fast-paced and dynamic environment. You will have the chance to work on challenging projects, learn from experienced professionals, and contribute to the growth of our company.


Read more
Mahindra

Mahindra

Agency job
via Pluginlive by Harsha Saggi
Bengaluru (Bangalore)
0 - 2 yrs
₹0 - ₹30000 / mo
API
skill iconPython
SQL
skill iconData Science
skill iconMachine Learning (ML)

PluginLive Technologies: a recruitment tech company is hiring for Mahindra & Mahindra


Mahindra Group are looking forward for Data Analytics Interns who possess some of the following skills (at group technology office of 100+ Mahindra group of companies)


Role Name- Data Science Intern

Internship Duration: 6 months


Internship Location: Bangalore


Stipend: 30,000/month


Qualifications- BE/BTECH/ME/MTECH in Data analytics/Data science/AI -ML/ Computer science with specialization in AI-ML /AI-DS / DS with a good academic track record of min 60% & above in their 10th & 12th & min 60% above in their graduation without any arrears.


Role & Responsibilities:

  • A passion and aspiration to bring data driven transformation with accelerated career growth.
  • Technical expertise with data models, data mining, data migration, data architecture and segmentation techniques
  • Conceptualize, design and deliver data & application platforms as a whole and envision the overall data architecture for short- and long-term needs of business and technology point of view.
  • Have a sound knowledge of enterprise date architectural technology stacks (On Prem, cloud (GCP/Azure/AWS) and different building arch blocks (Snowflake, Data bricks, Lambda etc.)
  • Must have experience in modern day storage (blob, SQL & NoSQL - document, graph, columnar, key-value), processing (traditional, container based and serverless), real time vs batch processing for big data (volume, velocity and variety), OLTP & OLAP
  • Must have experience with industry standard DevOps tools, design patterns and practices.
  • Experience with MLOps is a plus.
  • Experience with designing horizontally scalable systems and applications (asynchronous, synchronous and socket based, dashboards)
  • Must be great at consulting and stakeholder management. The profile requires to have great communication and presentation skills.
  • Must be able to lay out and come up with a data & system architecture (high level and low level).


Read more
Smartavya

Smartavya

Agency job
via Pluginlive by Harsha Saggi
Mumbai
10 - 18 yrs
₹35L - ₹40L / yr
Hadoop
Architecture
skill iconAmazon Web Services (AWS)
Google Cloud Platform (GCP)
PySpark
+13 more
  • Architectural Leadership:
  • Design and architect robust, scalable, and high-performance Hadoop solutions.
  • Define and implement data architecture strategies, standards, and processes.
  • Collaborate with senior leadership to align data strategies with business goals.
  • Technical Expertise:
  • Develop and maintain complex data processing systems using Hadoop and its ecosystem (HDFS, YARN, MapReduce, Hive, HBase, Pig, etc.).
  • Ensure optimal performance and scalability of Hadoop clusters.
  • Oversee the integration of Hadoop solutions with existing data systems and third-party applications.
  • Strategic Planning:
  • Develop long-term plans for data architecture, considering emerging technologies and future trends.
  • Evaluate and recommend new technologies and tools to enhance the Hadoop ecosystem.
  • Lead the adoption of big data best practices and methodologies.
  • Team Leadership and Collaboration:
  • Mentor and guide data engineers and developers, fostering a culture of continuous improvement.
  • Work closely with data scientists, analysts, and other stakeholders to understand requirements and deliver high-quality solutions.
  • Ensure effective communication and collaboration across all teams involved in data projects.
  • Project Management:
  • Lead large-scale data projects from inception to completion, ensuring timely delivery and high quality.
  • Manage project resources, budgets, and timelines effectively.
  • Monitor project progress and address any issues or risks promptly.
  • Data Governance and Security:
  • Implement robust data governance policies and procedures to ensure data quality and compliance.
  • Ensure data security and privacy by implementing appropriate measures and controls.
  • Conduct regular audits and reviews of data systems to ensure compliance with industry standards and regulations.
Read more
Optimum

Optimum

Agency job
via Pluginlive by Harsha Saggi
Bengaluru (Bangalore), Chennai
10 - 14 yrs
₹20L - ₹33L / yr
skill iconPython
SQL
Artificial Intelligence (AI)

Company: Optimum Solutions

About the company: Optimum solutions is a leader in a sheet metal industry, provides sheet metal solutions to sheet metal fabricators with a proven track record of reliable product delivery. Starting from tools through software, machines, we are one stop shop for all your technology needs.

Position: Generative AI Lead

Location: Chennai (Preference) and Bangalore

Minimum Qualification:  Bachelor's degree in computer science, Software Engineering, Data Science, or a related field.

Experience:   10-12 years

CTC:  33LPA

Employment Type:  Full Time

Key Responsibilities:

  1. Utilize advanced machine learning techniques to develop and train generative AI models.
  2. Collaborate with cross-functional teams to groom and preprocess large datasets for model training.
  3. Research and implement cutting-edge algorithms and architectures for generative AI applications.
  4. Optimize model performance and scalability for real-time inference and deployment.
  5. Stay current with industry trends and advancements in generative AI technology to drive innovation.
  6. Experiment with different hyperparameters and model configurations to improve generative AI model quality.
  7. Establish scalable, efficient, automated processes for data analyses, model development, validation and implementation,
  8. Choose suitable DL algorithms, software, hardware and suggest integration methods.
  9. Ensure AI ML solutions are developed, and validations are performed in accordance with Responsible AI guidelines & Standards
  10. To closely monitor the Model Performance and ensure Model Improvements are done post Project Delivery
  11. Provide technical expertise and guidance to support the integration of generative AI solutions into various products and services.
  12. Coach and mentor our team as we build scalable machine learning solutions
  13. Strong communication skills and an easy-going attitude
  14. Oversee development and implementation of assigned programs and guide teammates
  15. Carry out testing procedures to ensure systems are running smoothly
  16. Ensure that systems satisfy quality standards and procedures
  17. Build and manage strong relationships with stakeholders and various teams internally and externally,
  18. Provide direction and structure to assigned projects activities, establishing clear, precise goals, objectives and timeframes, run Project Governance calls with senior Stakeholders

Skills and Qualifications: 

  1. Strong understanding of machine learning and deep learning principles and algorithms.
  2. Experience in developing and implementing generative AI models and algorithms.
  3. Proficiency in programming languages such as Python, TensorFlow, and PyTorch.
  4. Ability to work with large datasets and knowledge of data preprocessing techniques.
  5. Familiarity with natural language processing (NLP) and computer vision for generative AI applications.
  6. Experience in building and deploying generative AI systems in real-world applications.
  7. Strong problem-solving and critical thinking skills for complex AI problems.
  8. Excellent communication and teamwork abilities to collaborate with cross-functional teams.
  9. Proven track record of delivering innovative solutions using generative AI technologies.
  10. Ability to stay updated with the latest advancements in generative AI and adapt to new techniques and methodologies.
  11. Application Development- Framework , Langchain, LlamaIndex , Vector DB , Pinecone/Chroma/Weaviate
  12. Fine Tuning Models-  Compute Service, Azure/Google Cloud/Lambda , Data Service, Scale/Labelbox, Hosting Services , Azure/AWS , ML Frameworks , Tensorflow,Pytorch
  13. Model Hubs-  Hugging Face , Databricks , 
  14. Foundation Models-  Open-Source , Mistral,Llama , Proprietary , GPT-4
  15. Compute Hardware- Specialized hardware for model training and inference,  Specialized hardware for model training and inference
  16. Programming Proficiency - 
  • Advanced Python Skills - Generative AI Experts should have a deep understanding of Python, including its data structures, OOP’s concepts, and libraries such as NumPy and Pandas. They must be able to write clean, efficient, and maintainable code to implement complex AI algorithms.
  • TensorFlow and Keras Expertise - TensorFlow and Keras are widely used in the AI community for building neural networks and deep learning models. Generative AI Experts should have a thorough understanding of these libraries, including how to design neural network architectures, customize loss functions, and optimize models for performance
  • Debugging and Optimization - Solving complicated problems is a common part of developing generative AI models. Experts must be adept in debugging methods, such as logging and profiling data to find and address problems quickly. They should also know how to optimize code for memory efficiency and performance, which will help the models manage large-scale datasets
  • Effective Data Management - One of the most frequent tasks in AI development is managing big datasets. Experts in generative AI should be adept at manipulating data with tools like Pandas and NumPy. To guarantee that the data they use for their models is of the highest caliber, they need also know how to efficiently preprocess and clean data.
  • Version Control and Collaboration - Git and other version control systems are crucial for tracking code changes and fostering developer collaboration in a team environment. To enable smooth cooperation on AI projects, generative AI Experts should be familiar with Git workflows, branching techniques, and handling merge conflicts

  17. Deep Learning Expertise - Neural Networks , Convolutional Neural Network , Recurrent Neural Network

  18. Knowledge of Generative Models - Transformers and Attention networks , Generative adversarial network

  19.Generative AI Basics and Advanced Concepts

  • Prompt Engineering - Crafting high-quality prompts is crucial for guiding generative models. Experts should excel in designing prompts that steer the model’s creativity and coherence. They must understand how to fine-tune prompts for tasks like text, image, and music generation.
  • Attention Mechanisms - Grasping attention mechanisms in models like Transformers, vital for capturing dependencies and context in generative tasks.
  • Application Development Approaches -Familiarity with integrating generative models into applications is essential. This includes deploying models in mobile apps, web applications, or as APIs. Experts should consider factors such as model size, latency, and scalability during deployment.
  • Fine-Tuning - Mastery of techniques like fine-tuning language models (e.g., GPT-3) for specific tasks. This involves adjusting model parameters and prompts to generate contextually relevant and accurate outputs.
  • RAG (Retrieval-Augmented Generation) - Understanding RAG, a framework that combines generative models with retrieval mechanisms. Experts can use RAG to improve model responses by retrieving relevant information from a large dataset. Proficiency in chaining multiple generative models together to create more complex and diverse outputs. This involves connecting models in a sequence to generate outputs that build upon each other.
  • Multimodal Generation - Ability to generate outputs across multiple modalities (e.g., text and images), requiring integration of different generative models.
Read more
Optimum

Optimum

Agency job
via Pluginlive by Harsha Saggi
Chennai, Bengaluru (Bangalore)
3 - 14 yrs
₹15L - ₹26L / yr
Artificial Intelligence (AI)
skill iconMachine Learning (ML)
skill iconPython
SQL

Company: Optimum Solutions

About the company: Optimum solutions is a leader in a sheet metal industry, provides sheet metal solutions to sheet metal fabricators with a proven track record of reliable product delivery. Starting from tools through software, machines, we are one stop shop for all your technology needs.

Role Overview:

  • Creating and managing database schemas that represent and support business processes, Hands-on experience in any SQL queries and Database server wrt managing deployment.
  • Implementing automated testing platforms, unit tests, and CICD Pipeline
  • Proficient understanding of code versioning tools, such as GitHub, Bitbucket, ADO
  • Understanding of container platform, such as Docker

Job Description

  • We are looking for a good Python Developer with Knowledge of Machine learning and deep learning framework.
  • Your primary focus will be working the Product and Usecase delivery team to do various prompting for different Gen-AI use cases
  • You will be responsible for prompting and building use case Pipelines
  • Perform the Evaluation of all the Gen-AI features and Usecase pipeline

Position: AI ML Engineer

Location: Chennai (Preference) and Bangalore

Minimum Qualification:  Bachelor's degree in computer science, Software Engineering, Data Science, or a related field.

Experience:  4-6 years

CTC: 16.5 - 17 LPA

Employment Type:  Full Time

Key Responsibilities:

  • Take care of entire prompt life cycle like prompt design, prompt template creation, prompt tuning/optimization for various Gen-AI base models
  • Design and develop prompts suiting project needs
  • Lead and manage team of prompt engineers
  • Stakeholder management across business and domains as required for the projects
  • Evaluating base models and benchmarking performance
  • Implement prompt gaurdrails to prevent attacks like prompt injection, jail braking and prompt leaking
  •  Develop, deploy and maintain auto prompt solutions
  • Design and implement minimum design standards for every use case involving prompt engineering

Skills and Qualifications

  • Strong proficiency with Python, DJANGO framework and REGEX
  • Good understanding of Machine learning framework Pytorch and Tensorflow
  • Knowledge of Generative AI and RAG Pipeline
  • Good in microservice design pattern and developing scalable application.
  • Ability to build and consume REST API
  • Fine tune and perform code optimization for better performance.
  • Strong understanding on OOP and design thinking
  • Understanding the nature of asynchronous programming and its quirks and workarounds
  • Good understanding of server-side templating languages
  • Understanding accessibility and security compliance, user authentication and authorization between multiple systems, servers, and environments
  • Integration of APIs, multiple data sources and databases into one system
  • Good knowledge in API Gateways and proxies, such as WSO2, KONG, nginx, Apache HTTP Server.
  • Understanding fundamental design principles behind a scalable and distributed application
  • Good working knowledge on Microservices architecture, behaviour, dependencies, scalability etc.
  • Experience in deploying on Cloud platform like Azure or AWS
  • Familiar and working experience with DevOps tools like Azure DEVOPS, Ansible, Jenkins, Terraform
Read more
Zethic Technologies

at Zethic Technologies

1 recruiter
Pooja G
Posted by Pooja G
Bengaluru (Bangalore)
2 - 4 yrs
₹12L - ₹15L / yr
skill iconAngularJS (1.x)
skill iconAngular (2+)
skill iconReact.js
skill iconNodeJS (Node.js)
skill iconMongoDB
+4 more

Zethic Technologies is one of the leading creative tech studio based in Bangalore. Zethic’s team members have years of experience in software development. Zethic specializes in Custom software development, Mobile Applications development, chatbot development, web application development, UI/UX designing, and consulting.

Your Responsibilities:

  1. Coordinating with the software development team in addressing technical doubts
  2. Reviewing ongoing operations and rectifying any issues
  3. Work closely with the developers to determine and implement appropriate design and code changes, and make relevant recommendations to the team
  4. Very good leadership skills with the ability to lead multiple development teams
  5. Ability to learn new technologies rapidly and share knowledge with other team members.
  6. Provide technical leadership to programmers working on the development project team.
  7. Must have knowledge of stages in SDLC
  8. Should be informed on designing the overall architecture of the web application. Should have experience working with graphic designers and converting designs to visual elements.
  9. Highly experienced with back-end programming languages (PHP, Python, JavaScript). Proficient experience using advanced JavaScript libraries and frameworks such as ReactJS.
  10. Development experience for both mobile and desktop. Knowledge of code versioning tools (GIT)
  11. Mentors junior web developers on technical issues and modern web development best practices and solutions
  12. Developing reusable code for continued use

Why join us?

We’re multiplying and the sky’s the limit
Work with a talented team you’ll learn a lot from them
We care about delivering value to our excellent customers
We are flexible in our opinions and always open to new ideas
We know it takes people with different ideas, strengths, backgrounds, cultures, beliefs, and interests to make our Company succeed.
We celebrate and respect all our employees equally.


Zethic ensures equal employment opportunity without discrimination or harassment based on race, color, religion, sex, gender identity, age, disability, national origin, marital status, genetic information, veteran status, or any other characteristic protected by law.

Read more
CorpCare

CorpCare

Agency job
via Pluginlive by Harsha Saggi
Mumbai
2 - 7 yrs
₹8L - ₹12L / yr
skill iconPython
skill iconDjango
skill iconFlask
FastAPI
skill iconAmazon Web Services (AWS)
+3 more

Company Description

CorpCare is India’s first all-in-one corporate funds and assets management platform based in Mumbai. We offer a single window solution for corporates, family offices, and HNIs to formulate and manage treasury management policies. Our portfolio management system provide assistance in conducting reviews with investment committees and the board. 


Role Description 

  • Role- Python Developer 
  • CTC- Upto 12 LPA 


This is a full-time on-site role for a Python Developer located in Mumbai. The Python Developer will be responsible for back-end web development, software development, object-oriented programming (OOP), programming, and databases. The Python Developer will also be responsible for performing system analysis and creating robust and scalable software solutions. 


Qualifications 

  • 2+ years of work experience with Python (Programming Language)
  • Expertise in Back-End Web Development • Proficiency in Software Development specially in Django framework, Fast API, Rest APIs, AWS
  • Experience in Programming and Databases
  • Understanding of Agile development methodologies 
  • Excellent problem-solving and analytical skills
  • Ability to work in a team environment
  • Bachelor's or Master's degree in Computer Science or relevant field
  • Relevant certifications in Python and related frameworks are preferred
Read more
Smartavya

Smartavya

Agency job
via Pluginlive by Harsha Saggi
Mumbai
12 - 16 yrs
₹32L - ₹35L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+12 more
  • Engage with client business team managers and leaders independently to understand their requirements, help them structure their needs into data needs, prepare functional and technical specifications for execution and ensure delivery from the data team. This can be combination of ETL Processes, Reporting Tools, Analytics tools like SAS, R and alike.
  • Lead and manage the Business Analytics team, ensuring effective execution of projects and initiatives.
  • Develop and implement analytics strategies to support business objectives and drive data-driven decision-making.
  • Analyze complex data sets to provide actionable insights that improve business performance.
  • Collaborate with other departments to identify opportunities for process improvements and implement data-driven solutions.
  • Oversee the development, maintenance, and enhancement of dashboards, reports, and analytical tools.
  • Stay updated with the latest industry trends and technologies in analytics and data
  • science.
Read more
Cargill Business Services
Vignesh R
Posted by Vignesh R
Bengaluru (Bangalore)
4 - 7 yrs
Best in industry
skill iconDocker
skill iconKubernetes
DevOps
skill iconAmazon Web Services (AWS)
Windows Azure
+2 more

Job Purpose and Impact

The DevOps Engineer is a key position to strengthen the security automation capabilities which have been identified as a critical area for growth and specialization within Global IT’s scope. As part of the Cyber Intelligence Operation’s DevOps Team, you will be helping shape our automation efforts by building, maintaining and supporting our security infrastructure.

Key Accountabilities

  • Collaborate with internal and external partners to understand and evaluate business requirements.
  • Implement modern engineering practices to ensure product quality.
  • Provide designs, prototypes and implementations incorporating software engineering best practices, tools and monitoring according to industry standards.
  • Write well-designed, testable and efficient code using full-stack engineering capability.
  • Integrate software components into a fully functional software system.
  • Independently solve moderately complex issues with minimal supervision, while escalating more complex issues to appropriate staff.
  • Proficiency in at least one configuration management or orchestration tool, such as Ansible.
  • Experience with cloud monitoring and logging services.

Qualifications

Minimum Qualifications

  • Bachelor's degree in a related field or equivalent exp
  • Knowledge of public cloud services & application programming interfaces
  • Working exp with continuous integration and delivery practices

Preferred Qualifications

  • 3-5 years of relevant exp whether in IT, IS, or software development
  • Exp in:
  •  Code repositories such as Git
  • Scripting languages (Python & PowerShell)
  • Using Windows, Linux, Unix, and mobile platforms within cloud services such as AWS
  • Cloud infrastructure as a service (IaaS) / platform as a service (PaaS), microservices, Docker containers, Kubernetes, Terraform, Jenkins
  • Databases such as Postgres, SQL, Elastic
Read more
Bengaluru (Bangalore)
2 - 10 yrs
Best in industry
skill iconPython
Natural Language Processing (NLP)
Generative AI
skill iconChatGPT
Azure
+2 more

Job Purpose and Impact:


The Sr. Generative AI Engineer will architect, design and develop new and existing GenAI solutions for the organization. As a Generative AI Engineer, you will be responsible for developing and implementing products using cutting-edge generative AI and RAG to solve complex problems and drive innovation across our organization. You will work closely with data scientists, software engineers, and product managers to design, build, and deploy AI-powered solutions that enhance our products and services in Cargill. You will bring order to ambiguous scenarios and apply in depth and broad knowledge of architectural, engineering and security practices to ensure your solutions are scalable, resilient and robust and will share knowledge on modern practices and technologies to the shared engineering community.


Key Accountabilities:


• Apply software and AI engineering patterns and principles to design, develop, test, integrate, maintain and troubleshoot complex and varied Generative AI software solutions and incorporate security practices in newly developed and maintained applications.

• Collaborate with cross-functional teams to define AI project requirements and objectives, ensuring alignment with overall business goals.

• Conduct research to stay up-to-date with the latest advancements in generative AI, machine learning, and deep learning techniques and identify opportunities to integrate them into our products and services, optimizing existing generative AI models and RAG for improved performance, scalability, and efficiency, developing and maintaining pipelines and RAG solutions including data preprocessing, prompt engineering, benchmarking and fine-tuning.

• Develop clear and concise documentation, including technical specifications, user guides and presentations, to communicate complex AI concepts to both technical and non-technical stakeholders.

• Participate in the engineering community by maintaining and sharing relevant technical approaches and modern skills in AI.

• Contribute to the establishment of best practices and standards for generative AI development within the organization.

• Independently handle complex issues with minimal supervision, while escalating only the most complex issues to appropriate staff.


Minimum Qualifications:


• Bachelor’s degree in a related field or equivalent experience

• Minimum of five years of related work experience

• You are proficient in Python and have experience with machine learning libraries and frameworks

• Have deep understanding of industry leading Foundation Model capabilities and its application.

• You are familiar with cloud-based Generative AI platforms and services

• Full stack software engineering experience to build products using Foundation Models

• Confirmed experience architecting applications, databases, services or integrations.

Read more
Wissen Technology

at Wissen Technology

4 recruiters
Vijayalakshmi Selvaraj
Posted by Vijayalakshmi Selvaraj
Bengaluru (Bangalore)
3 - 5 yrs
₹4L - ₹25L / yr
skill iconGo Programming (Golang)
skill iconRuby on Rails (ROR)
skill iconRuby
skill iconPython
skill iconJava
+2 more

Job Description:


● 3+ years of experience as a Data engineer or related role.

● 3+ years of experience in application development using Python

● Strong experience with SQL and good to have NoSQL.

● Experience with Agile engineering practices.

● Preferred experience in writing queries for RDBMS, cloud-based data warehousing solutions like

Snowflake.

● Ability to work independently or as part of a team.

● Experience with cloud platforms, preferably AWS, is good to have

● Experience with ETL/LT tools and methodologies.

● Experience working on real-time Data Streaming and Data Streaming platforms


About Wissen Technology:

 

·       The Wissen Group was founded in the year 2000. Wissen Technology, a part of Wissen Group, was established in the year 2015.

·       Wissen Technology is a specialized technology company that delivers high-end consulting for organizations in the Banking & Finance, Telecom, and Healthcare domains. We help clients build world class products.

·       Our workforce consists of 550+ highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League Universities like Wharton, MIT, IITs, IIMs, and NITs and with rich work experience in some of the biggest companies in the world.

·       Wissen Technology has grown its revenues by 400% in these five years without any external funding or investments.

·       Globally present with offices US, India, UK, Australia, Mexico, and Canada.

·       We offer an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, Quality Assurance & Test Automation.

·       Wissen Technology has been certified as a Great Place to Work®.

·       Wissen Technology has been voted as the Top 20 AI/ML vendor by CIO Insider in 2020.

·       Over the years, Wissen Group has successfully delivered $650 million worth of projects for more than 20 of the Fortune 500 companies.

·       We have served clients across sectors like Banking, Telecom, Healthcare, Manufacturing, and Energy. They include Morgan Stanley, MSCI, StateStreet, Flipkart, Swiggy, Trafigura, GE to name a few.

 

 

Website : www.wissen.com 

Read more
Bengaluru (Bangalore)
3 - 10 yrs
₹10L - ₹35L / yr
Linux/Unix
cicd
DevOps
Ansible
skill iconPython
+1 more

We are looking for multiple hands-on software engineers to handle CI/CD build and packaging engineering to facilitate RtBrick Full Stack (RBFS) software packages for deployment on various hardware platforms. You will be part of a high-performance team responsible for platform and infrastructure.


Requirements

 

1. About 3-10 years of industry experience in Linux administration with an emphasis on automation

2. Experience with CI/CD tooling framework and cloud deployments

3. Experience With Software Development Tools like Git, Gitlab, Jenkins, Cmake, GNU build tools & Ansible

4. Proficient in Python and Shell scripting. Experience with Go-lang is excellent to have

5. Experience with Linux "Apt" Package Management, Web server, optional Open Network Linux (ONL), infrastructure like boot, pxe, IPMI, APC

6. Experience with Open Networking Linux (ONL) is highly desirable. SONIC build experience will be a plus.


Responsibilities

 

CI/CD- Packaging

Knowledge of compilation, packaging and repository usage in various flavors of Linux.

Expertise in Linux system administration and internals is essential. Ability to build custom images with container, Virtual Machine environment, modify bootloader, reduce image and optimize containers for low power consumption.


Linux Administration

Install and configure Linux systems, including back-end database and scripts, perform system maintenance by reviewing error logs, create systems backup, and build Linux modules and packages for software deployment. Build packages in Open Network Linux and SONIC distributions in the near future.

 

Read more
Cargill Business Services
Sindhu Dagennavar
Posted by Sindhu Dagennavar
Bengaluru (Bangalore)
15 - 18 yrs
₹35L - ₹40L / yr
SAP ERP
SAP BASIS
Fiori
UI5
SAP ABAP
+1 more

Job Purpose and Impact:


The Enterprise Resource Planning (ERP) Engineering Supervisor will lead a small engineering team across technology and business capabilities to build and enhance modern business applications for ERP systems in the company. In this role, you will guide team in product development, architecture and technology adherence to ensure delivered solutions are secure and scalable. You will also lead team development and cross team relationships and delivery to advance the company's engineering delivery.


Key Accountabilities:


  • Lead a team of engineering professionals that design, develop, deploy and enhance the new and existing software solutions.
  • Provide direction to the team to build highly scalable and resilient software products and platforms to support business needs.
  • Provide input and guidance to the delivery team across technology and business capabilities to accomplish team deliverables.
  • Provide support to software engineers dedicated to products in other portfolios within ERP teams.
  • Partner with the engineering community to coach engineers, share relevant technical approaches, identify new trends, modern skills and present code methodologies.


Qualifications:


MINIMUM QUALIFICATIONS:


  • Bachelor’s degree in a related field or equivalent experience
  • Minimum of four years of related work experience


PREFERRED QUALIFICATIONS:


  • Confirmed hands on technical experience with technologies including cloud, software development and continuous integration and continuous delivery
  • 2 years of supervisory experience
  • Experience leading engineers in the area of ERP basis, Code Development (ABAP, HTML 5, Python, Java, etc..), or Design Thinking.


Read more
Vola Finance

at Vola Finance

1 video
5 recruiters
Reshika Mendiratta
Posted by Reshika Mendiratta
Bengaluru (Bangalore)
3 - 5 yrs
₹20L - ₹25L / yr
skill iconAmazon Web Services (AWS)
Data engineering
Spark
SQL
Data Warehouse (DWH)
+4 more

Roles & Responsibilities


Basic Qualifications:

● The position requires a four-year degree from an accredited college or university.

● Three years of data engineering / AWS Architecture and security experience.


Top candidates will also have:

Proven/Strong understanding and/or experience in many of the following:-

● Experience designing Scalable AWS architecture.

● Ability to create modern data pipelines and data processing using AWS PAAS components (Glue, etc.) or open source tools (Spark, Hbase, Hive, etc.).

● Ability to develop SQL structures that support high volumes and scalability using

RDBMS such as SQL Server, MySQL, Aurora, etc.

● Ability to model and design modern data structures, SQL/NoSQL databases, Data Lakes, Cloud Data Warehouse

● Experience in creating Network Architecture for secured scalable solution.

● Experience with Message brokers such as Kinesis, Kafka, Rabbitmq, AWS SQS, AWS SNS, and Apache ActiveMQ. Hands-on experience on AWS serverless architectures such as Glue,Lamda, Redshift etc.

● Working knowledge of Load balancers, AWS shield, AWS guard, VPC, Subnets, Network gateway Route53 etc.

● Knowledge of building Disaster management systems and security logs notification system

● Knowledge of building scalable microservice architectures with AWS.

● To create a framework for monthly security checks and wide knowledge on AWS services

● Deploying software using CI/CD tools such CircleCI, Jenkins, etc.

● ML/ AI model deployment and production maintainanace experience is mandatory.

● Experience with API tools such as REST, Swagger, Postman and Assertible.

● Versioning management tools such as github, bitbucket, GitLab.

● Debugging and maintaining software in Linux or Unix platforms.

● Test driven development

● Experience building transactional databases.

● Python, PySpark programming experience .

● Must experience engineering solutions in AWS.

● Working AWS experience, AWS certification is required prior to hiring

● Working in Agile Framework/Kanban Framework

● Must demonstrate solid knowledge of computer science fundamentals like data structures & algorithms.

● Passion for technology and an eagerness to contribute to a team-oriented environment.

● Demonstrated leadership on medium to large-scale projects impacting strategic priorities.

● Bachelor’s degree in Computer science or Electrical engineering or related field is required

Read more
DeepIntent

at DeepIntent

2 candid answers
17 recruiters
Indrajeet Deshmukh
Posted by Indrajeet Deshmukh
Pune
3 - 6 yrs
Best in industry
skill iconKubernetes
skill iconGit
MySQL
skill iconAmazon Web Services (AWS)
CI/CD
+3 more

With a core belief that advertising technology can measurably improve the lives of patients, DeepIntent is leading the healthcare advertising industry into the future. Built purposefully for the healthcare industry, the DeepIntent Healthcare Advertising Platform is proven to drive higher audience quality and script performance with patented technology and the industry’s most comprehensive health data. DeepIntent is trusted by 600+ pharmaceutical brands and all the leading healthcare agencies to reach the most relevant healthcare provider and patient audiences across all channels and devices. For more information, visit DeepIntent.com or find us on LinkedIn.


We are seeking a skilled and experienced Site Reliability Engineer (SRE) to join our dynamic team. The ideal candidate will have a minimum of 3 years of hands-on experience in managing and maintaining production systems, with a focus on reliability, scalability, and performance. As an SRE at Deepintent, you will play a crucial role in ensuring the stability and efficiency of our infrastructure, as well as contributing to the development of automation and monitoring tools.


Responsibilities:

  • Deploy, configure, and maintain Kubernetes clusters for our microservices architecture.
  • Utilize Git and Helm for version control and deployment management.
  • Implement and manage monitoring solutions using Prometheus and Grafana.
  • Work on continuous integration and continuous deployment (CI/CD) pipelines.
  • Containerize applications using Docker and manage orchestration.
  • Manage and optimize AWS services, including but not limited to EC2, S3, RDS, and AWS CDN.
  • Maintain and optimize MySQL databases, Airflow, and Redis instances.
  • Write automation scripts in Bash or Python for system administration tasks.
  • Perform Linux administration tasks and troubleshoot system issues.
  • Utilize Ansible and Terraform for configuration management and infrastructure as code.
  • Demonstrate knowledge of networking and load-balancing principles.
  • Collaborate with development teams to ensure applications meet reliability and performance standards.


Additional Skills (Good to Know):

  • Familiarity with ClickHouse and Druid for data storage and analytics.
  • Experience with Jenkins for continuous integration.
  • Basic understanding of Google Cloud Platform (GCP) and data center operations.


Qualifications:

  • Minimum 3 years of experience in a Site Reliability Engineer role or similar.
  • Proven experience with Kubernetes, Git, Helm, Prometheus, Grafana, CI/CD, Docker, and microservices architecture.
  • Strong knowledge of AWS services, MySQL, Airflow, Redis, AWS CDN.
  • Proficient in scripting languages such as Bash or Python.
  • Hands-on experience with Linux administration.
  • Familiarity with Ansible and Terraform for infrastructure management.
  • Understanding of networking principles and load balancing.


Education:

Bachelor's degree in Computer Science, Information Technology, or a related field.


DeepIntent is committed to bringing together individuals from different backgrounds and perspectives. We strive to create an inclusive environment where everyone can thrive, feel a sense of belonging, and do great work together.

DeepIntent is an Equal Opportunity Employer, providing equal employment and advancement opportunities to all individuals. We recruit, hire and promote into all job levels the most qualified applicants without regard to race, color, creed, national origin, religion, sex (including pregnancy, childbirth and related medical conditions), parental status, age, disability, genetic information, citizenship status, veteran status, gender identity or expression, transgender status, sexual orientation, marital, family or partnership status, political affiliation or activities, military service, immigration status, or any other status protected under applicable federal, state and local laws. If you have a disability or special need that requires accommodation, please let us know in advance.

DeepIntent’s commitment to providing equal employment opportunities extends to all aspects of employment, including job assignment, compensation, discipline and access to benefits and training.

Read more
Eclat Engineering Pvt Ltd
Remote only
2 - 4 yrs
₹6L - ₹8L / yr
Software Testing (QA)
Test Automation (QA)
Appium
Selenium
skill iconJava
+6 more

About The Role

To design, implement, and execute testing procedures for our software applications. In this role, the candidate will be instrumental in driving our software quality assurance lifecycle, collaborating with development teams to establish test strategies, and developing automated tests to uphold our stringent quality benchmarks, thereby reducing manual regression efforts.


By integrating tests into the CI/CD pipeline, the candidate will ensure that software releases are reliable and of high quality. Additionally, the candidate will troubleshoot and diagnose issues in systems under test, contributing to the continuous improvement of the software development process.


What Describes You Best

  • Minimum Bachelor's degree in Computer Science, Engineering, or a related discipline.
  • 2 to 3 years experience in Automation Testing.
  • Experience of working on SAAS /enterprise products is preferred.


Technical Skills: (must have)

  • Sound understanding of SDLC processes and the QA lifecycle and methodology
  • Hands-on experience with test Automation tools and frameworks such as Selenium WebDriver (with Java), Cucumber, Appium, or TestNG
  • Proven experience in test automation using Java scripting language
  • Strong Understanding of DOM
  • Good experience with continuous integration/continuous deployment (CI/CD) concepts and tools like Jenkins or GitLab CI.
  • Hands-on experience with any of the bug tracking and test management tools (e.g. GitLab, Jira, Jenkins, Bugzilla, etc.)
  • Experience with API testing (Postman or Similar RESTClient)


Additional Skills: (nice to have)

  • Knowledge of performance testing tools such as JMeter
  • Knowledge of Serenity BDD Framework
  • Knowledge of Python programming language


What will you Own

The key accountability of the candidate will be to maintain and enhance the QA automation process (along with CI/CD/CT), create/update test suites, write documentation, and ensure quality delivery of our software components by automation testing and also contribute to manual testing when required. Furthermore, enhance the product by utilizing automation scripting in solution development and improving processes/workflows.


How will you spend your time at Eclat

QA and Documentation

  • Sketching out ideas for automated software test procedures.
  • Enhancing, Optimizing, and maintaining automated CI/CD/CT workflows.
  • Write, design, execute, and maintain automation scripts for web and mobile platforms.
  • Maximizing test coverage for the most critical features of the application to reduce manual testing effort and quick regression.
  • Reviewing software bug reports, maintaining reporting of automation test suites, and highlighting problem areas.
  • Manage and Troubleshooting issues in systems under test.
  • Establishing and coordinating test strategies with development/product teams.
  • Manage documentation repositories and version control systems.

Post-delivery participation - Training and User Feedback

  • Participating in user feedback sessions to identify and understand user persona and requirements.
  • Working closely with the support team in providing necessary product technical support.

Why Join Us

  • Be a part of our growth story as we aim to take a leadership position in international markets
  • Opportunity to manage and lead global teams and channel partner network
  • Join technology innovators who believe in solving world-scale challenges to drive global knowledge-sharing
  • Healthy work/life balance, offering wellbeing initiatives, parental leave, career development assistance, required work infrastructure support


Read more
Solar Secure
Saurabh Singh
Posted by Saurabh Singh
Remote only
0 - 1 yrs
₹8000 - ₹10000 / mo
skill iconData Science
skill iconPython
Jupyter Notebook

About the Company :

Nextgen Ai Technologies is at the forefront of innovation in artificial intelligence, specializing in developing cutting-edge AI solutions that transform industries. We are committed to pushing the boundaries of AI technology to solve complex challenges and drive business success.


Currently offering "Data Science Internship" for 2 months.


Data Science Projects details In which Intern’s Will Work :

Project 01 : Image Caption Generator Project in Python

Project 02 : Credit Card Fraud Detection Project

Project 03 : Movie Recommendation System

Project 04 : Customer Segmentation

Project 05 : Brain Tumor Detection with Data Science


Eligibility


A PC or Laptop with decent internet speed.

Good understanding of English language.

Any Graduate with a desire to become a web developer. Freshers are welcomed.

Knowledge of HTML, CSS and JavaScript is a plus but NOT mandatory.

Fresher are welcomed. You will get proper training also, so don't hesitate to apply if you don't have any coding background.


#please note that THIS IS AN INTERNSHIP , NOT A JOB.


We recruit permanent employees from inside our interns only (if needed).


Duration : 02 Months 

MODE: Work From Home (Online)


Responsibilities


Manage reports and sales leads in salesforce.com, CRM.

Develop content, manage design, and user access to SharePoint sites for customers and employees.

Build data driven reports, store procedures, query optimization using SQL and PL/SQL knowledge.

Learned the essentials to C++ and Java to refine code and build the exterior layer of web pages.

Configure and load xml data for the BVT tests.

Set up a GitHub page.

Develop spark scripts by using Scala shell as per requirements.

Develop and A/B test improvements to business survey questions on iOS.

Deploy statistical models to various company data streams using Linux shells.

Create monthly performance-base client billing reports using MySQL and NoSQL databases.

Utilize Hadoop and MapReduce to generate dynamic queries and extract data from HDFS.

Create source code utilizing JavaScript and PHP language to make web pages functional.

Excellent problem-solving skills and the ability to work independently or as part of a team.

Effective communication skills to convey complex technical concepts.


Benefits


Internship Certificate

Letter of recommendation

Stipend Performance Based

Part time work from home (2-3 Hrs per day)

5 days a week, Fully Flexible Shift


Read more
NetSquare Solutions
Aishwarya M
Posted by Aishwarya M
Remote only
3 - 15 yrs
Best in industry
Ansible
CI/CD
gitlab
skill iconJenkins
Bash
+1 more

We are seeking a skilled DevOps Engineer with 3+ years of experience to join our team on a permanent work-from-home basis.


Responsibilities:

  • Develop and maintain infrastructure using Ansible.
  • Write Ansible playbooks.
  • Implement CI/CD pipelines.
  • Manage GitLab repositories.
  • Monitor and troubleshoot infrastructure issues.
  • Ensure security and compliance.
  • Document best practices.


Qualifications:

  • Proven DevOps experience.
  • Expertise with Ansible and CI/CD pipelines.
  • Proficient with GitLab.
  • Strong scripting skills.
  • Excellent problem-solving and communication skills.


Regards,

Aishwarya M

Associate HR

Read more
Noida
4yrs+
₹20L - ₹30L / yr
skill iconPython
CI/CD
skill iconJenkins
DevOps
Robot Framework
+1 more

About Us

Prismberry is a leading provider of software & automation services to diverse industries. We specializes in Software Development and IT Services, with expertise in be-spoke automation & cloud-based software solutions for diverse industries. We are dedicated to delivering

innovative solutions that transform businesses.


Key responsibilities

  • Build a Python Automation Suite for hardware platforms used in servers.
  • Build and compose tests for firmware and system software components.
  • Build and manage suites of automated regression tests.
  • Work as an independent contributor.


Key skills and experience required

  • Outstanding debugging and programming abilities in Python
  • Excellent understanding of CI/CD and DevOps principles; ability to create pipelines with Jenkins and open-source tools
  • Strong familiarity with Robot Framework & Bash scripting
  • Outstanding oral and writing communication abilities, a strong work ethic, and a strong sense of cooperation.
  • Practical experience interacting closely with microcontroller & microprocessors
  • Good to have knowledge of test frameworks such as PyTest, Unit-test, Flask, and others for Python development.
  • It would be beneficial to have knowledge of cloud computing platforms like AWS, or GCP.
  • It is favored to have a basic understanding of the tools used to implement Kubernetes, containers, or virtualization technologies.
  • Must have practical expertise any Version Control System such as GIT.


Why Prismberry

  • Competitive salary and performance-based incentives
  • Opportunities for career growth and advancement
  • Collaborative and innovative work environment
  • Cutting-edge technology solutions
  • Strong commitment to employee development and well-being
Read more
TVARIT GmbH
Pune
8 - 15 yrs
₹20L - ₹25L / yr
skill iconPython
CI/CD
Systems Development Life Cycle (SDLC)
ETL
JIRA
+5 more

TVARIT GmbH develops and delivers solutions in the field of artificial intelligence (AI) for the Manufacturing, automotive, and process industries. With its software products, TVARIT makes it possible for its customers to make intelligent and well-founded decisions, e.g., in forward-looking Maintenance, increasing the OEE and predictive quality. We have renowned reference customers, competent technology, a good research team from renowned Universities, and the award of a renowned AI prize (e.g., EU Horizon 2020) which makes TVARIT one of the most innovative AI companies in Germany and Europe.



Requirements:

  • Python Experience: Minimum 3+ years.
  • Software Development Experience: Minimum 8+ years.
  • Data Engineering and ETL Workloads: Minimum 2+ years.
  • Familiarity with Software Development Life Cycle (SDLC).
  • CI/CD Pipeline Development: Experience in developing CI/CD pipelines for large projects.
  • Agile Framework & Sprint Methodology: Experience with Jira.
  • Source Version Control: Experience with GitHub or similar SVC.
  • Team Leadership: Experience leading a team of software developers/data scientists.

Good to Have:

  • Experience with Golang.
  • DevOps/Cloud Experience (preferably AWS).
  • Experience with React and TypeScript.

Responsibilities:

  • Mentor and train a team of data scientists and software developers.
  • Lead and guide the team in best practices for software development and data engineering.
  • Develop and implement CI/CD pipelines.
  • Ensure adherence to Agile methodologies and participate in sprint planning and execution.
  • Collaborate with the team to ensure the successful delivery of projects.
  • Provide on-site support and training in Pune.

Skills and Attributes:

  • Strong leadership and mentorship abilities.
  • Excellent problem-solving skills.
  • Effective communication and teamwork.
  • Ability to work in a fast-paced environment.
  • Passionate about technology and continuous learning.


Note: This is a part-time position paid on an hourly basis. The initial commitment is 4-8 hours per week, with potential fluctuations.


Join TVARIT and be a pivotal part of shaping the future of software development and data engineering.

Read more
Wissen Technology

at Wissen Technology

4 recruiters
Vijayalakshmi Selvaraj
Posted by Vijayalakshmi Selvaraj
Bengaluru (Bangalore)
7 - 12 yrs
₹4L - ₹30L / yr
skill iconPython
SQL
snowflake

● 5+ years of experience as a Data engineer or related role.

● 5+ years of experience in application development using Python

● Strong experience with SQL and good to have NoSQL.

● Experience with Agile engineering practices.

● Preferred experience in writing queries for RDBMS, cloud-based data warehousing solutions like

Snowflake.

● Ability to work independently or as part of a team.

● Experience with cloud platforms, preferably AWS, is good to have

● Experience with ETL/LT tools and methodologies.

● Experience working on real-time Data Streaming and Data Streaming platforms

Read more
Optimo Capital

at Optimo Capital

2 candid answers
Ajinkya Pokharkar
Posted by Ajinkya Pokharkar
Bengaluru (Bangalore)
1 - 3 yrs
₹5.5L - ₹12L / yr
skill iconPython
skill iconData Analytics
credit risk
Portfolio management
Underwriting
+2 more

About Us:

Optimo Capital is a newly established NBFC founded by Prashant Pitti, who is also a co-founder of EaseMyTrip (a billion-dollar listed startup that grew profitably without any funding).


Our mission is to serve the underserved MSME businesses with their credit needs in India. With less than 15% of MSMEs having access to formal credit, we aim to bridge this credit gap through a phygital model (physical branches + digital decision-making).


As a technology and data-first company, tech lovers and data enthusiasts play a crucial role in building the analytics & tech at Optimo that helps the company thrive.


What We Offer:

Join our dynamic startup team as a Senior Data Analyst and play a crucial role in core data analytics projects involving credit risk, lending strategy, credit underwriting features analytics, collections, and portfolio management. The analytics team at Optimo works closely with the Credit & Risk departments, helping them make data-backed decisions.


This is an exceptional opportunity to learn, grow, and make a significant impact in a fast-paced startup environment. We believe that the freedom and accountability to make decisions in analytics and technology bring out the best in you and help us build the best for the company. This environment offers you a steep learning curve and an opportunity to experience the direct impact of your analytics contributions. Along with this, we offer industry-standard compensation.


What We Look For:

We are looking for individuals with a strong analytical mindset and a fundamental understanding of the lending industry, primarily focused on credit risk. We value not only your skills but also your attitude and hunger to learn, grow, lead, and thrive, both individually and as part of a team. We encourage you to take on challenges, bring in new ideas, implement them, and build the best analytics systems. Your willingness to put in the extra hours to build the best will be recognized.


Skills/Requirements:

  • Credit Risk & Underwriting: Fundamental knowledge of credit risk and underwriting processes is mandatory. Experience in any lending financial institution is a must. A thorough understanding of all the features evaluated in the underwriting process like credit report info, bank statements, GST data, demographics, etc., is essential.
  • Analytics (Python): Excellent proficiency in Python - Pandas and Numpy. A strong analytical mindset and the ability to extract actionable insights from any analysis are crucial. The ability to convert the given problem statements into actionable analytics tasks and frame effective approaches to tackle them is highly desirable.
  • Good to have but not mandatory: REST APIs: A fundamental understanding of APIs and previous experience or projects related to API development or integrations. Git: Proficiency in version control systems, particularly Git. Experience in collaborative projects using Git is highly valued.


What You'll Be Working On:

  • Analyze data from different data sources, extract information, and create action items to tackle the given open-ended problems.
  • Build strong analytics systems and dashboards that provide easy access to data and insights, including the current status of the company, portfolio health, static pool, branch-wise performance, TAT (turnaround time) monitoring, and more.
  • Assist the credit and risk team with insights and action items, helping them make data-backed decisions and fine-tune the credit policy (high involvement in the credit and underwriting process).
  • Work on different rule engines that automate the underwriting process end-to-end.


Other Requirements:

  • Availability for full-time work in Bangalore. Immediate joiners are preferred.
  • Strong passion for analytics and problem-solving.
  • At least 1 year of industry experience in an analytics role, specifically in a lending institution, is a must.
  • Self-motivated and capable of working both independently and collaboratively.


If you are ready to embark on an exciting journey of growth, learning, and innovation, apply now to join our pioneering team in Bangalore.

Read more
Janapriya Educational society
Miyapur, Hyderabad
0 - 5 yrs
₹2.5L - ₹3.5L / yr
skill iconPython
skill iconJavascript
skill iconC++
skill iconHTML/CSS
Scratch
+1 more

Hi,

I am HR from Janapriya school , Miyapur , Hyderabad , Telangana.

Currently we are looking for a primary computer teacher .

the teacher should have atleast 2 years experience in teaching computers .

Intrested candidates can apply to the above posting.

Read more
Incubyte

at Incubyte

7 recruiters
Gouthami Vallabhaneni
Posted by Gouthami Vallabhaneni
Remote only
1 - 3 yrs
₹8L - ₹18L / yr
Data Transformation Tool (DBT)
SQL
Windows Azure
MySQL
ETL
+6 more

Who are we?

 

We are incubators of high-quality, dedicated software engineering teams for our clients. We work with product organizations to help them scale or modernize their legacy technology solutions. We work with startups to help them operationalize their idea efficiently. Incubyte strives to find people who are passionate about coding, learning, and growing along with us. We work with a limited number of clients at a time on dedicated, long term commitments with an aim of bringing a product mindset into services.

 

What we are looking for

 

We’re looking to hire software craftspeople. People who are proud of the way they work and the code they write. People who believe in and are evangelists of extreme programming principles. High quality, motivated and passionate people who make great teams. We heavily believe in being a DevOps organization, where developers own the entire release cycle and thus get to work not only on programming languages but also on infrastructure technologies in the cloud.

 

What you’ll be doing

 

First, you will be writing tests. You’ll be writing self-explanatory, clean code. Your code will produce the same, predictable results, over and over again. You’ll be making frequent, small releases. You’ll be working in pairs. You’ll be doing peer code reviews.

 

You will work in a product team. Building products and rapidly rolling out new features and fixes.

 

You will be responsible for all aspects of development – from understanding requirements, writing stories, analyzing the technical approach to writing test cases, development, deployment, and fixes. You will own the entire stack from the front end to the back end to the infrastructure and DevOps pipelines. And, most importantly, you’ll be making a pledge that you’ll never stop learning!

 

Skills you need in order to succeed in this role


Most Important: Integrity of character, diligence and the commitment to do your best

Must Have: SQL, Databricks, (Scala / Pyspark), Azure Data Factory, Test Driven Development

Nice to Have: SSIS, Power BI, Kafka, Data Modeling, Data Warehousing

 

Self-Learner: You must be extremely hands-on and obsessive about delivering clean code

 

  • Sense of Ownership: Do whatever it takes to meet development timelines
  • Experience in creating end to end data pipeline
  • Experience in Azure Data Factory (ADF) creating multiple pipelines and activities using Azure for full and incremental data loads into Azure Data Lake Store and Azure SQL DW
  • Working experience in Databricks
  • Strong in BI/DW/Datalake Architecture, design and ETL
  • Strong in Requirement Analysis, Data Analysis, Data Modeling capabilities
  • Experience in object-oriented programming, data structures, algorithms and software engineering
  • Experience working in Agile and Extreme Programming methodologies in a continuous deployment environment.
  • Interest in mastering technologies like, relational DBMS, TDD, CI tools like Azure devops, complexity analysis and performance
  • Working knowledge of server configuration / deployment
  • Experience using source control and bug tracking systems,

   writing user stories and technical documentation

  • Strong in Requirement Analysis, Data Analysis, Data Modeling capabilities
  • Expertise in creating tables, procedures, functions, triggers, indexes, views, joins and optimization of complex
  • Experience with database versioning, backups, restores and
  • Expertise in data security and
  • Ability to perform database performance tuning queries
Read more
FiftyFive Technologies Pvt Ltd
Shreshtha Bhatnagar
Posted by Shreshtha Bhatnagar
Remote only
5 - 15 yrs
₹20L - ₹30L / yr
skill iconC++
skill iconPython
skill iconC
Linux/Unix
Red Hat Linux
+4 more

Malware Researcher


We are seeking a highly talented Malware Researcher with a strong background in sandboxing and

reverse engineering techniques.


Skills

● Excellent written and verbal communication skills.

● Understanding of security principles and best practices.

● Must have a solid understanding of macOS and iOS Malware.

● Strong capabilities of dynamic/static Malware analysis.

● Can improve research with entropy methods.

● Encryption/obfuscation proficiency.


Description

● Research new malware online.

● Detecting trends and waves of infections.

● Writing technical articles on findings.

● Code reading to determine if a particular file is malicious or not.

● Writing regular expressions to detect and remove malicious code.

● Act as a strong advocate for quality in the product development process for software

engineering.

● Self-motivated and able to grasp issues quickly and make educated, critical judgments

in the absence of complete requirements.

● Able to multitask in a dynamic, fast-paced environment, proven social skills, and worked

with various multi-functional teams to deliver high-quality products.


Qualification

● 5+ years of direct experience in macOS and iOS environments surrounding threats.

● Bachelor's degree in Information Security or a related technical field.

● Intel and ARM architecture knowledge.

● macOS and iOS coding experience is a plus

Read more
Truminds Software Systems
Sonali Pandey
Posted by Sonali Pandey
Hyderabad
2 - 3 yrs
₹5L - ₹7L / yr
skill iconC
skill iconC++
Linux/Unix
skill iconPython
skill iconGit
+1 more

Mandatory Skills

  • C/C++ Programming
  • Linux System concepts
  • Good Written and verbal communication skills
  • Good problem-solving skills
  • Python scripting experience
  • Prior experience in Continuous Integration and Build System is a plus
  • SCM tools like git, perforce etc is a plus
  • Repo, Git and Gerrit tools
  • Android Build system expertise
  • Automation development experience with like Electric Commander, Jenkins, Hudson


Read more
LambdaTest

at LambdaTest

3 recruiters
Madhuri Sandur
Posted by Madhuri Sandur
Noida
2 - 5 yrs
₹10L - ₹30L / yr
Software Testing (QA)
Test Automation (QA)
Appium
Selenium
CI/CD
+1 more

About LambdaTest

We are the only cloud-based testing platform aimed at bringing the whole testing ecosystem to the cloud. Today we have a powerful cloud network of 2000+ real browsers and operating systems that helps testers in cross-browser and cross-platform compatibility testing.


Our Mission


On a mission to provide the most powerful, cutting-edge, comprehensive, and secure cloud test platform to empower software testers & developers globally to perform testing intelligently at scale.


Our Vision


We envision an integrated platform where professionals can rely to perform and manage all types of tests without being limited by infrastructure dependency. So people could focus on things that matter the most, i.e. their tests.



 What You'll Do

  • Ability to create tools, microsite, DevOps, and technical solutions for testing.
  • Experience in Object-Oriented Analysis, Design(OOAD), and development of software using UML Methodology, good knowledge of J2EE design patterns and Core Java design patterns.
  • Analyze test logs; create test reports, co-ordinate with stakeholders
  • Experience in web application and device test automation using Selenium, Robotium, Appium, or any equivalent tool/s.
  • Strong experience with Agile development incorporating Continuous Integration and Continuous Delivery, utilizing technologies such as GIT, Maven, Jenkins, Chef, Sonar, PowerMock.
  •  Design and build scalable automated test frameworks and test suites working across technologies.
  • Debugging of any issue faced.
  • GoLang, Docker, Kubernetes experience is good to have
  • Perform manual testing, the scope of which will encompass all functionalities of services as a prequel to automation
  • Experience working closely with development and business teams to communicate impacts and to understand business requirements


What you should have 


  • A Bachelor's or Master's degree with 2 – 6 years of experience as a Developer or SDET.
  • Comfortable communicating cross-functionally and across management levels in formal and informal settings.
  • Ability to effectively articulate technical challenges and solutions.
  • Shows creativity and initiative to improve product coverage and effectiveness.
  • Ability to work in teams.
  • Deal well with ambiguous/undefined problems; ability to think abstractly.
  • Go-getter attitude.


Read more
TIFIN FINTECH

at TIFIN FINTECH

1 recruiter
Vrishali Mishra
Posted by Vrishali Mishra
Bengaluru (Bangalore)
10 - 15 yrs
Best in industry
skill iconPython
skill iconReact.js

WHO WE ARE:

 

TIFIN is a fintech platform backed by industry leaders including JP Morgan, Morningstar, Broadridge, Hamilton Lane, Franklin Templeton, Motive Partners and a who’s who of the financial service industry. We are creating engaging wealth experiences to better financial lives through AI and investment intelligence powered personalization. We are working to change the world of wealth in ways that personalization has changed the world of movies, music and more but with the added responsibility of delivering better wealth outcomes.

 

We use design and behavioral thinking to enable engaging experiences through software and application programming interfaces (APIs). We use investment science and intelligence to build algorithmic engines inside the software and APIs to enable better investor outcomes.

 

In a world where every individual is unique, we match them to financial advice and investments with a recognition of their distinct needs and goals across our investment marketplace and our advice and planning divisions.

OUR VALUES: Go with your GUT

 

  • Grow at the Edge. We are driven by personal growth. We get out of our comfort zone and keep egos aside to find our genius zones. With self-awareness and integrity we strive to be the best we can possibly be. No excuses.
  • Understanding through Listening and Speaking the Truth. We value transparency. We communicate with radical candor, authenticity and precision to create a shared understanding. We challenge, but once a decision is made, commit fully.
  • I Win for Teamwin. We believe in staying within our genius zones to succeed and we take full ownership of our work. We inspire each other with our energy and attitude. We fly in formation to win together.

 

Responsibilities:

 

  • Develop user-facing features such as web apps and landing portals.
  • Ensure the feasibility of UI/UX designs and implement them technically.
  • Create reusable code and libraries for future use.
  • Optimize applications for speed and scalability.
  • Contribute to the entire implementation process, including defining improvements based on business needs and architectural enhancements.
  • Promote coding, testing, and deployment of best practices through research and demonstration.
  • Review frameworks and design principles for suitability in the project context.
  • Demonstrate the ability to identify opportunities, lay out rational plans, and see them through to completion.

 

Requirements:

 

  • Bachelor’s degree in Engineering with 10+ years of software product development experience.
  • Proficiency in React, Django, Pandas, GitHub, AWS, and JavaScript, Python
  • Strong knowledge of PostgreSQL, MongoDB, and designing REST APIs.
  • Experience with scalable interactive web applications.
  • Understanding of software design constructs and implementation.
  • Familiarity with ORM libraries and Test-Driven Development.
  • Exposure to the Finance domain is preferred.
  • Knowledge of HTML5, LESS/CSS3, jQuery, and Bootstrap.
  • Expertise in JavaScript fundamentals and front-end/back-end technologies.

 

Nice to Have:

 

  • Strong knowledge of website security and common vulnerabilities.
  • Exposure to financial capital markets and instruments.

 

Compensation and Benefits Package:

 

  • Competitive compensation with a discretionary annual bonus.
  • Performance-linked variable compensation.
  • Medical insurance.

 

A note on location. While we have team centers in Boulder, New York City, San Francisco, Charlotte, and Mumbai, this role is based out of Bangalore

 

TIFIN is an equal-opportunity workplace, and we value diversity in our workforce. All qualified applicants will receive consideration for employment without regard to any discrimination.

 

Read more
PeerXP

at PeerXP

1 recruiter
amritha
Posted by amritha
Bengaluru (Bangalore)
1 - 1 yrs
₹1L - ₹1.5L / yr
skill iconGo Programming (Golang)
skill iconRuby on Rails (ROR)
skill iconRuby
skill iconPython
skill iconJava
+4 more

Requirements

  • Design and implement a full-stack web application using Python Django framework and ReactJS.
  • 1+ years of experience in building and deploying web applications.
  • Experience in designing and using RESTful APIs.
  • Basic knowledge in front-end technologies such as JavaScript, , ReactJS, HTML5, and CSS3.
  • Understanding of fundamental design principles behind a scalable application.
  • Understanding of databases, SQL and non-relational, plus the Django ORM.
  • Strong proficiency in JavaScript, including DOM manipulation and the JavaScript object model.
  • Experience in web mark-up like HTML and CSS.
  • Experience with data structure libraries.
  • Familiarity with RESTful APIs.
  • A knack for bench marking and optimization.
  • Knowledge of modern authorization mechanisms, such as JSON Web Token.
  • Familiarity with modern front-end build pipelines and tools.
  • Experience with common front-end development tools such as NPM etc.
  • Strong knowledge of Git version control.
  • Experience deploying Python applications into production.
  • Amazon Web Services (AWS) knowledge is a plus.


Responsibilities

  • Writing reusable, testable, and efficient code.
  • Design and implementation of low-latency, high-availability, and performance applications using Django framework in Python.
  • Create and use the REST APIs for communicating with other apps.
  • Assess and prioritize feature requests and work in Agile framework.
  • Implementation of security and data protection algorithms.
  • Integration of data storage solutions like databases, key-value stores, blob stores, S3 etc.
  • Improve the functionality of existing systems and applications.
  • High commitment to work and taking ownership of deliverables.
  • Help to research and influence our path forward with strategic technology initiatives.


Read more
Zweeny Pvt Ltd
Preeti Harshavardhan
Posted by Preeti Harshavardhan
Hyderabad
5 - 8 yrs
₹24L - ₹40L / yr
skill iconPython

The Technical Lead will oversee all aspects of application development at TinyPal. This position involves both managing the development team and actively contributing to the coding and architecture, particularly in the backend development using Python. The ideal candidate will bring a strategic perspective to the development process, ensuring that our solutions are robust, scalable, and aligned with our business goals.



Key Responsibilities:

  • Lead and manage the application development team across all areas, including backend, frontend, and mobile app development.
  • Hands-on development and oversight of backend systems using Python, ensuring high performance, scalability, and integration with frontend services.
  • Architect and design innovative solutions that meet market needs and are aligned with the company’s technology strategy, with a strong focus on embedding AI technologies to enhance app functionalities.
  • Coordinate with product managers and other stakeholders to translate business needs into technical strategies, particularly in leveraging AI to solve complex problems and improve user experiences.
  • Maintain high standards of software quality by establishing good practices and habits within the development team.
  • Evaluate and incorporate new technologies and tools to improve application development processes, with a particular emphasis on AI and machine learning technologies.
  • Mentor and support team members to foster a collaborative and productive environment.
  • Lead the deployment and continuous integration of applications across various platforms, ensuring AI components are well integrated and perform optimally.


Required Skills and Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Software Engineering, or a related field.
  • Minimum of 7 years of experience in software development, with at least 1 year in a leadership role.
  • Expert proficiency in Python and experience with frameworks like Django or Flask.
  • Broad experience in full lifecycle development of large-scale applications.
  • Strong architectural understanding of both frontend and backend technologies, with a specific capability in integrating AI into complex systems.
  • Experience with cloud platforms (AWS, Azure, Google Cloud), and understanding of DevOps and CI/CD processes.
  • Demonstrated ability to think strategically about business, product, and technical challenges, including the adoption and implementation of AI solutions.
  • Excellent team management, communication, and interpersonal skills.



Read more
Pace Stock Broking Services
Delhi, Gurugram, Noida, Ghaziabad, Faridabad
6 - 12 yrs
₹25L - ₹40L / yr
skill iconReact.js
skill iconGo Programming (Golang)
skill iconFlutter
skill iconJava
skill iconPython
+4 more

 

Experience Level: Minimum 5 years

About Pace

Started in 1995 by first-generation entrepreneurs from IIMA & FMS Delhi, PACE has evolved from a fledgling NSE Broker to a premier boutique financial conglomerate over the last 25 years. Headquartered in New Delhi, we maintain offices at more than 300 locations in more than 75 cities across India, and our customer base is spread over 34 countries. We have also been consistently nominated as one of the best Investment Advisors in India by ICRA & CNBC. At PACE we are continuously innovating and building highly scalable backend systems and strategies that give a seamless experience to our customers. We are aggressively pursuing Fintech innovation now and working on the ambitious and potentially disruptive Fintech product ‘Pocketful’—a one-of-a-kind stock-broking platform.


About Pocketful (Fintech Division of Pace)

Founded by IIM-Ahmedabad, Yale, and Columbia alumni, Pocketful is a new-age Fintech broking platform, aimed at making financial markets accessible for all. We're constantly innovating and working on a disruptive platform. The team is highly skilled, young, and extremely hungry and we are looking for folks who fit this persona. We are backed by one of India's leading stock brokers Pace Stock Broking Services.

Overview:

  • We are seeking an experienced Engineering Manager or Tech Lead to join our dynamic team in the fintech industry, focusing on stockbroking solutions. The ideal candidate will have a strong technical background and leadership experience, with proficiency in our tech stacks: React.js, Flutter, Golang, and MongoDB.
  • Responsibilities:
  • Lead and manage a team of engineers, providing guidance and mentorship.
  • Oversee the design, development, and deployment of high-quality software solutions.
  • Collaborate with cross-functional teams to define, design, and deliver new features.
  • Ensure best practices in coding, architecture, and security are followed.

Requirements:

  • Proven experience as an Engineering Manager or Tech Lead.
  • Strong technical expertise in React.js, Flutter, Golang, and MongoDB.
  • Excellent leadership and communication skills.
  • Experience in the fintech industry, particularly in stockbroking, is a plus.
  • Ability to work in a fast-paced, agile environment.

Qualifications:

  • Minimum of 5+ years of experience in a technical management role.
  • Bachelor's degree in Technology
  • Strong project management skills, including the ability to prioritize and manage multiple tasks simultaneously.
  • Excellent leadership and communication skills.
  • Problem-solving and decision-making abilities.
  • Results-oriented with a focus on delivering high-quality solutions.

 

Other details

Expected CTC: Depending on Experience & Skills

In-Person based out of Okhla, New Delhi

 

Culture

We’re still early-stage and we believe that the culture is an ever-evolving process. Help build the kind of culture you want in the organization. Best ideas come from collaboration and we firmly believe in that. We have a flat hierarchy, flexible with timings and we believe in continuous learning and adapting to changing needs. We want to scale fast but sustainably, keeping everyone’s growth in mind. We aim to make this job your last job.

 

Read more
BigThinkCode Technologies
Kumar AGS
Posted by Kumar AGS
Chennai
3 - 5 yrs
₹3L - ₹12L / yr
pytest
Web API
STLC
Systems Development Life Cycle (SDLC)
JIRA
+4 more

At BigThinkCode, our technology solves complex problems. We are looking for Senior test engineer to join us at Chennai. This is an opportunity to join a growing team and make a substantial impact at BigThinkCode Technologies.


Please find below our job description, if interested apply / reply sharing your profile to connect and discuss.


Company: BigThinkCode Technologies


URL: https://www.bigthinkcode.com/


Experience: 3 – 5 years


Level: Test engineer (Senior)


Location: Chennai (Work from Office)


Joining time: Immediate – 4 weeks of time.


Mandatory skill: API/Web services testing using Restful/SOAP, Web automation, Python programming language basics along with frameworks lile PyUnit or Pytest, TFS or Jira, OOPs concepts.


Required skills:

• 3 - 5 years or above software testing experience, including mobile app testing experience.

• In depth understanding of SDLC and STLC

• API and Webservices testing using Restful or SOAP or Postman tools.

• Basic oops concepts and Python programming language is nice to have.

• Hands on experience with testing frameworks like PyUnit (Unittest), Pytest, Nosetest, Doctest and Robot is required.

• Design, develop test cases and execute test.

• 2+ years of experience in TFS or Jira

• Identify application issues, report discrepancies, prepare test report and recommend improvements.

• Track the issues raised by UAT and Production users..

• Be familiar with common test tools (Postman/Charles/Fiddler etc.), have basic understanding of HTTP/HTTPS protocol.

• Ownership attitude of the product and team player who fits in with her/his team.

Read more
Wissen Technology

at Wissen Technology

4 recruiters
Sukanya Mohan
Posted by Sukanya Mohan
Pune, Bengaluru (Bangalore)
5 - 10 yrs
Best in industry
skill iconAmazon Web Services (AWS)
EMR
skill iconPython
GLUE
SQL
+1 more

Greetings , Wissen Technology is Hiring for the position of Data Engineer

Please find the Job Description for your Reference:


JD

  • Design, develop, and maintain data pipelines on AWS EMR (Elastic MapReduce) to support data processing and analytics.
  • Implement data ingestion processes from various sources including APIs, databases, and flat files.
  • Optimize and tune big data workflows for performance and scalability.
  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
  • Manage and monitor EMR clusters, ensuring high availability and reliability.
  • Develop ETL (Extract, Transform, Load) processes to cleanse, transform, and store data in data lakes and data warehouses.
  • Implement data security best practices to ensure data is protected and compliant with relevant regulations.
  • Create and maintain technical documentation related to data pipelines, workflows, and infrastructure.
  • Troubleshoot and resolve issues related to data processing and EMR cluster performance.

 

 

Qualifications:

 

  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • 5+ years of experience in data engineering, with a focus on big data technologies.
  • Strong experience with AWS services, particularly EMR, S3, Redshift, Lambda, and Glue.
  • Proficiency in programming languages such as Python, Java, or Scala.
  • Experience with big data frameworks and tools such as Hadoop, Spark, Hive, and Pig.
  • Solid understanding of data modeling, ETL processes, and data warehousing concepts.
  • Experience with SQL and NoSQL databases.
  • Familiarity with CI/CD pipelines and version control systems (e.g., Git).
  • Strong problem-solving skills and the ability to work independently and collaboratively in a team environment
Read more
product base company based at Bangalore location and working

product base company based at Bangalore location and working

Agency job
Remote only
4 - 9 yrs
₹20L - ₹30L / yr
Data Structures
Large Language Models (LLM) tuning
GPT
Llama2
Mistral
+9 more

We are seeking an experienced Data Scientist with a proven track record in Machine Learning, Deep Learning, and a demonstrated focus on Large Language Models (LLMs) to join our cutting-edge Data Science team. You will play a pivotal role in developing and deploying innovative AI solutions that drive real-world impact to patients and healthcare providers.

Responsibilities

• LLM Development and Fine-tuning: fine-tune, customize, and adapt large language models (e.g., GPT, Llama2, Mistral, etc.) for specific business applications and NLP tasks such as text classification, named entity recognition, sentiment analysis, summarization, and question answering. Experience in other transformer-based NLP models such as BERT, etc. will be an added advantage.

• Data Engineering: collaborate with data engineers to develop efficient data pipelines, ensuring the quality and integrity of large-scale text datasets used for LLM training and fine-tuning

• Experimentation and Evaluation: develop rigorous experimentation frameworks to evaluate model performance, identify areas for improvement, and inform model selection. Experience in LLM testing frameworks such as TruLens will be an added advantage.

• Production Deployment: work closely with MLOps and Data Engineering teams to integrate models into scalable production systems.

• Predictive Model Design and Implementation: leverage machine learning/deep learning and LLM methods to design, build, and deploy predictive models in oncology (e.g., survival models)

• Cross-functional Collaboration: partner with product managers, domain experts, and stakeholders to understand business needs and drive the successful implementation of data science solutions

• Knowledge Sharing: mentor junior team members and stay up to date with the latest advancements in machine learning and LLMs

Qualifications Required

• Doctoral or master’s degree in computer science, Data Science, Artificial Intelligence, or related field

• 5+ years of hands-on experience in designing, implementing, and deploying machine learning and deep learning models

• 12+ months of in-depth experience working with LLMs. Proficiency in Python and NLP-focused libraries (e.g., spaCy, NLTK, Transformers, TensorFlow/PyTorch).

• Experience working with cloud-based platforms (AWS, GCP, Azure)

Additional Skills

• Excellent problem-solving and analytical abilities

• Strong communication skills, both written and verbal

• Ability to thrive in a collaborative and fast-paced environment

Read more
Tazapay

at Tazapay

1 recruiter
Harshini  N M
Posted by Harshini N M
Chennai, Bengaluru (Bangalore)
6 - 11 yrs
₹25L - ₹35L / yr
skill iconPython
skill iconGo Programming (Golang)
skill icon.NET
skill iconRuby

 

Job Title - Senior Backend Engineer

About Tazapay 

Tazapay is a cross border payment service provider. They offer local collections via local payment methods, virtual accounts and cards in over 70 markets. The merchant does not need to create local entities anywhere and Tazapay offers the additional compliance framework to take care of local regulations and requirements. This results in decreased transaction costs, fx transparency and higher auth rates. 


They are licensed and backed by leading investors.   www.tazapay.com 


What’s exciting waiting for You? 

This is an amazing opportunity for you to join a fantastic crew before the rocket ship launch. It will be a story you will carry with you through your life and have the unique experience of building something ground up and have the satisfaction of seeing your product being used and paid for by thousands of customers. You will be a part of a growth story be it anywhere - Sales, Software Development, Marketing, HR, Accounting etc. 

We believe in a culture of openness, innovation & great memories together. 


Are you ready for the ride? 

Find what interesting things you could do with us. 


About the Backend Engineer role

Responsibilities (not exhaustive)


  • Design, write and deliver highly scalable, reliable and fault tolerant systems with minimal guidance 
  • Participate in code and design reviews to maintain our high development standards
  • Partner with the product management team to define and execute the feature roadmap
  • Translate business requirements into scalable and extensible design
  • Proactively manage stakeholder communication related to deliverables, risks, changes and dependencies
  • Coordinate with cross functional teams (Mobile, DevOps, Data, UX, QA etc.) on planning and execution
  • Continuously improve code quality, product execution, and customer delight
  • Willingness to learn new languages and methodologies
  • An enormous sense of ownership
  • Engage in service capacity and demand planning, software performance analysis, tuning and optimization


The Ideal Candidate

 

Education

  • Degree in Computer Science or equivalent with 5+ years of experience in commercial software development in large distributed systems

Experience

  • Hands-on experience in designing, developing, testing and deploying applications on Golang, Ruby,Python, .Net Core or Java for large scale applications
  • Deep knowledge of Linux as a production environment
  • Strong knowledge of data structures, algorithms, distributed systems, and asynchronous architectures
  • Expert in at least 1 of the following languages: Golang, Python, Ruby, Java, C, C++
  • Proficient in OOP, including design patterns. 
  • Ability to design and implement low latency RESTful services
  • Hands-on coder who has built backend services that handle high volume traffic.
  • Strong understanding of system performance and scaling
  • Possess excellent communication, sharp analytical abilities with proven design skills, able to think critically of the current system in terms of growth and stability
  • Data modeling experience in both Relational and NoSQL databases
  • Continuously refactor applications to ensure high-quality design
  • Ability to plan, prioritize, estimate and execute releases with good degree of predictability
  • Ability to scope, review and refine user stories for technical completeness and to alleviate dependency risks
  • Passion for learning new things, solving challenging problems
  • Ability to get stuff done!


Nice to have

  • Familiarity with Golang ecosystem
  • Familiarity with running web services at scale; understanding of systems internals and networking are a plus
  • Be familiar with HTTP/HTTPS communication protocols.

Abilities and Traits

  • Ability to work under pressure and meet deadlines
  • Ability to provide exceptional attention to details of the product.
  • Ability to focus for extended periods of repetitious activity.
  • Ability to think ahead and anticipate problems, issues and solutions
  • Work well as a team player and help the team members to resolve issues
  • Be committed to quality and be structured in approach
  • Excellent and demonstrable concept formulation, logical and analytical skills
  • Excellent planning, organizational, and prioritization skills

 

Location - Chennai - India


Join our team and let's groove together to the rhythm of innovation and opportunity!




Your Buddy

Tazapay



Read more
Sarvaha Systems Private Limited
Remote, Pune
5 - 10 yrs
₹20L - ₹40L / yr
TypeScript
skill iconPython
BDD
Test Automation (QA)
skill iconJenkins
+2 more

Sarvaha would like to welcome talented Software Development Engineer in Test (SDET) with minimum 5 years of experience to join our team. As an SDET, you will champion the quality of the product and will design, develop, and maintain modular, extensible, and reusable test cases/scripts. This is a hands-on role which requires you to work with automation test developers and application developers to enhance the quality of the products and development practices. Please visit our website at http://www.sarvaha.com to know more about us.


Key Responsibilities


  • Understand requirements through specification or exploratory testing, estimate QA efforts, design test strategy, develop optimal test cases, maintain RTM
  • Design, develop & maintain a scalable test automation framework
  • Build interfaces to seamlessly integrate testing with development environments.
  • Create & manage test setups that prioritize scalability, remote accessibility and reliability.
  • Automate test scripts, create and execute relevant test suites, analyze test results and enhance existing or build newer scripts for coverage. Communicate with stakeholders for requirements, troubleshooting etc; provide visibility into the works by sharing relevant reports and metrics
  • Stay up-to-date with industry best practices in testing methodologies and technologies to advise QA and integration teams.



Skills Required


  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field (Software Engineering preferred).
  • Minimum 5+ years of experience in testing enterprise-grade/highly scalable, distributed applications, products, and services.
  • Expertise in manual and Automation testing with excellent understanding of test methodologies and test design techniques, test life cycle.
  • Strong programming skills in Typescript and Python, with experience using Playwright for building hybrid/BDD frameworks for Website and API automation
  • Very good problem-solving and analytical skills.
  • Experience in databases, both SQL and No-SQL
  • Practical experience in setting up CI/CD pipelines (ideally with Jenkins).
  • Exposure to Docker, Kubernetes and EKS is highly desired.
  • C# experience is an added advantage. 
  • A continuous learning mindset and a passion for exploring new technologies.
  • Excellent communication, collaboration, quick learning of needed language/scripting and influencing skills.


Position Benefits


  • Competitive salary and excellent growth opportunities within a dynamic team.
  • Positive and collaborative work environment with the opportunity to learn from talented colleagues.
  • Highly challenging and rewarding software development problems to solve.
  • Hybrid work model with established remote work options.


Read more
Supercoder
Yukta Doshi
Posted by Yukta Doshi
Remote only
3 - 4 yrs
₹10L - ₹30L / yr
skill iconVue.js
skill iconAngularJS (1.x)
skill iconAngular (2+)
skill iconReact.js
skill iconJavascript
+5 more

Key Responsibilities: 


  • Develop and maintain APIs using Django Rest Framework (DRF).
  • Utilize Postgres with Django ORM for database management.
  • Configuring and managing servers based on AWS to ensure the smooth operation of our platform.
  • Implement and manage asynchronous tasks and scheduling using Celery and Redis.
  • Optimize data storage and retrieval with Redis Cache.
  • Setup and manage development and deployment environments using Docker.
  • Develop backend office applications using Vue.js.
  • Collaborate with cross-functional teams to ensure seamless integration of backend and frontend systems.
  • Handle both backend and backend office development tasks efficiently.


Qualifications: 


  • Expertise in implementing and managing asynchronous tasks and scheduling using Celery and Redis.
  • Familiarity with Redis Cache for efficient data storage and retrieval.
  • Hands-on experience with server configuration and management based on AWS.
  • Proficiency in deployment and development environment setup using Docker.
  • Proven experience in developing APIs using Django Rest Framework (DRF).
  • Strong background in utilizing Postgres with Django ORM.
  • Demonstrated expertise in backend development for platform services.
  • Proficiency in frontend development for backend office applications using Vue.js.
  • Ability to handle both backend and backend office development tasks effectively.
  • Front-end development skills are at the level of the administrator's site.


Read more
CallHub

at CallHub

4 candid answers
1 video
Eman Khan
Posted by Eman Khan
Bengaluru (Bangalore)
1 - 3 yrs
₹14L - ₹20L / yr
skill iconPython
skill iconDjango
skill iconJavascript
skill iconReact.js
skill iconHTML/CSS
+4 more

About

Founded in 2011, CallHub provides cloud based communication software for nonprofits, political parties, advocacy organizations and businesses. We have delivered over 200 millions messages and calls for thousands of customers. We help political candidates during their campaigns in getting their message across to their voters, conduct surveys, manage event/town-hall invites and with recruiting volunteers for election campaigns. We are profitable with 8000+ paying customers in 200 countries across North America, Australia and Europe. Our customers include Uber, the Democratic Party, major political parties in the US, Canada, UK, France and Australia. 


Your responsibilities:

  • Design & develop world-class, scalable & reliable and highly available products/services in the SaaS space.
  • Writing clean, great quality, scalable and maintainable code which are covered by automated tests.
  • Understand requirements and deliver new features and services at a fast pace.
  • Use MVC frameworks, SOA concepts and design patterns to build loosely coupled services and systems.
  • Participate in the entire lifecycle of the product - design, documentation, coding, testing and deployment.
  • Imbibe and maintain a strong customer delight attitude while designing and building products.
  • Communicate well with product and relevant stakeholders.


What we’re looking for:

  • 1-3 yrs experience in developing applications, web services using Python and Django.
  • Working knowledge of developing responsive user interfaces using JavaScript, React, HTML and CSS
  • Good grasp of computer science fundamentals - data structures and algorithms.
  • Good understanding of SQL DB and NoSQL like Redis, Mongo etc.
  • Ability to pick up new technologies and assess situations quickly.
  • Detail oriented. Ability to empathize with customers.
  • Good written and verbal communication skills.
  • Team player with extreme ownership and strong interpersonal skills, willing to ask for help and offer support to the rest of the team.
  • BE/MS/MCA from reputed institutes in India or abroad


What you can look forward to:


  • You will be working independently on the requirements so will get a holistic exposure to complete software development processes including system design, development (backend, frontend), QA and DevOps.
  • You will get to see your work directly impacting users in a big way.
  • You will have the opportunity to work on the latest technologies as we are constantly innovating to provide reliable and scalable solutions for our customers.
  • We value openness in the company and love delighting our customers.
Read more
Get to hear about interesting companies hiring right now
Company logo
Company logo
Company logo
Company logo
Company logo
Linkedin iconFollow Cutshort
Why apply via Cutshort?
Connect with actual hiring teams and get their fast response. No spam.
Find more jobs
Get to hear about interesting companies hiring right now
Company logo
Company logo
Company logo
Company logo
Company logo
Linkedin iconFollow Cutshort