Python Jobs in Hyderabad

50+ Python Jobs in Hyderabad | Python Job openings in Hyderabad

Apply to 50+ Python Jobs in Hyderabad on CutShort.io. Explore the latest Python Job opportunities across top companies like Google, Amazon & Adobe.

Caw Studios
Posted by Stuti Jain
Hyderabad
5 - 8 yrs
₹22L - ₹28L / yr
Python
Go Programming (Golang)
RESTful APIs
MongoDB
MySQL

We are seeking an experienced Senior Golang Developer to join our dynamic engineering team at our Hyderabad office (Hybrid option available).


What You'll Do:

  • Collaborate with a team of engineers to design, develop, and support web and mobile applications using Golang.
  • Work in a fast-paced agile environment, delivering high-quality solutions focused on continuous innovation.
  • Tackle complex technical challenges with creativity and out-of-the-box thinking.
  • Take ownership of critical components and gradually assume responsibility for significant portions of the product.
  • Develop robust, scalable, and performant backend systems using Golang.
  • Contribute to all phases of the development lifecycle, including design, coding, testing, and deployment.
  • Build and maintain SQL and NoSQL databases to support application functionality.
  • Document your work and collaborate effectively with cross-functional teams, including QA, engineering, and business units.
  • Work with global teams to architect solutions, provide estimates, reduce complexity, and deliver a world-class platform.


Who Should Apply:

  • 5+ years of experience in backend development with a strong focus on Golang.
  • Proficient in building and deploying RESTful APIs and microservices.
  • Experience with SQL and NoSQL databases (e.g., MySQL, MongoDB).
  • Familiarity with cloud platforms such as AWS and strong Linux skills.
  • Hands-on experience with containerization and orchestration tools like Docker and Kubernetes.
  • Knowledge of system design principles, scalability, and high availability.
  • Exposure to frontend technologies like React or mobile development is a plus.
  • Experience working in an Agile/Scrum environment.


Read more
Fractal Analytics
Posted by Eman Khan
Bengaluru (Bangalore), Hyderabad, Gurugram, Noida, Mumbai, Pune, Coimbatore, Chennai
3 - 5 yrs
₹18L - ₹26L / yr
MLOps
MLflow
Kubeflow
Machine Learning (ML)
Python
+6 more

Building the machine learning production system (MLOps) is the biggest challenge most large companies currently face in the transition to becoming an AI-driven organization. This position is an opportunity for an experienced server-side developer to build expertise in this exciting new frontier. You will be part of a team deploying state-of-the-art AI solutions for Fractal clients.


Responsibilities

As an MLOps Engineer, you will work collaboratively with data scientists and data engineers to deploy and operate advanced analytics machine learning models. You'll help automate and streamline model development and model operations, build and maintain tools for deployment, monitoring, and operations, and troubleshoot and resolve issues in development, testing, and production environments.

  • Enable model tracking, model experimentation, and model automation (a minimal tracking sketch follows the qualifications below)
  • Develop scalable ML pipelines
  • Develop MLOps components in the machine learning development life cycle using a model repository (either of): MLflow, Kubeflow Model Registry
  • Develop MLOps components in the machine learning development life cycle using machine learning services (either of): Kubeflow, DataRobot, Hopsworks, Dataiku, or any relevant ML E2E PaaS/SaaS
  • Work across all phases of the model development life cycle to build MLOps components
  • Build the knowledge base required to deliver increasingly complex MLOps projects on Azure
  • Be an integral part of client business development and delivery engagements across multiple domains


Required Qualifications

  • 3-5 years of experience building production-quality software.
  • B.E./B.Tech./M.Tech. in Computer Science or a related technical degree, or equivalent.
  • Strong experience in system integration, application development, or data warehouse projects across technologies used in the enterprise space.
  • Knowledge of MLOps, machine learning, and Docker.
  • Object-oriented languages (e.g., Python, PySpark, Java, C#, C++).
  • CI/CD experience (e.g., Jenkins, GitHub Actions).
  • Database programming using any flavor of SQL.
  • Knowledge of Git for source code management.
  • Ability to collaborate effectively with highly technical resources in a fast-paced environment.
  • Ability to solve complex challenges/problems and rapidly deliver innovative solutions.
  • Foundational knowledge of cloud computing on Azure.
  • Hunger and passion for learning new skills.
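To ground the model-tracking responsibility above, here is a minimal, purely illustrative sketch (the experiment name, parameter, metric, and model are hypothetical, not part of the role) of logging one training run to MLflow:

    # Hypothetical example: logging one training run to MLflow.
    # Assumes `pip install mlflow scikit-learn` and a local ./mlruns store.
    import mlflow
    import mlflow.sklearn
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=500, n_features=10, random_state=42)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

    mlflow.set_experiment("demo-experiment")  # hypothetical experiment name

    with mlflow.start_run():
        C = 0.5
        model = LogisticRegression(C=C, max_iter=200).fit(X_train, y_train)
        acc = accuracy_score(y_test, model.predict(X_test))

        mlflow.log_param("C", C)                  # track hyperparameters
        mlflow.log_metric("accuracy", acc)        # track evaluation metrics
        mlflow.sklearn.log_model(model, "model")  # store the model artifact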
Read more
Fractal Analytics
Posted by Eman Khan
Bengaluru (Bangalore), Gurugram, Mumbai, Hyderabad, Pune, Noida, Coimbatore, Chennai
5.5 - 9 yrs
₹25L - ₹38L / yr
MLOps
MLflow
Kubeflow
Machine Learning (ML)
Python
+6 more

Building the machine learning production system (MLOps) is the biggest challenge most large companies currently face in the transition to becoming an AI-driven organization. This position is an opportunity for an experienced server-side developer to build expertise in this exciting new frontier. You will be part of a team deploying state-of-the-art AI solutions for Fractal clients.


Responsibilities

As an MLOps Engineer, you will work collaboratively with data scientists and data engineers to deploy and operate advanced analytics machine learning models. You'll help automate and streamline model development and model operations, build and maintain tools for deployment, monitoring, and operations, and troubleshoot and resolve issues in development, testing, and production environments.

  • Enable model tracking, model experimentation, and model automation
  • Develop scalable ML pipelines
  • Develop MLOps components in the machine learning development life cycle using a model repository (either of): MLflow, Kubeflow Model Registry
  • Machine learning services (either of): Kubeflow, DataRobot, Hopsworks, Dataiku, or any relevant ML E2E PaaS/SaaS
  • Work across all phases of the model development life cycle to build MLOps components
  • Build the knowledge base required to deliver increasingly complex MLOps projects on Azure
  • Be an integral part of client business development and delivery engagements across multiple domains


Required Qualifications

  • 5.5-9 years of experience building production-quality software.
  • B.E./B.Tech./M.Tech. in Computer Science or a related technical degree, or equivalent.
  • Strong experience in system integration, application development, or data warehouse projects across technologies used in the enterprise space.
  • Expertise in MLOps, machine learning, and Docker.
  • Object-oriented languages (e.g., Python, PySpark, Java, C#, C++).
  • Experience developing CI/CD components for production-ready ML pipelines.
  • Database programming using any flavor of SQL.
  • Knowledge of Git for source code management.
  • Ability to collaborate effectively with highly technical resources in a fast-paced environment.
  • Ability to solve complex challenges/problems and rapidly deliver innovative solutions.
  • Team handling, problem solving, project management, communication skills, and creative thinking.
  • Foundational knowledge of cloud computing on Azure.
  • Hunger and passion for learning new skills.
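As a hedged companion to the model-repository responsibilities above (the run id, model name, and stage are placeholders), registering a logged model and promoting it with the MLflow client could look like this:

    # Hypothetical example: registering a run's model and promoting it.
    # Assumes an MLflow tracking server and a run that logged a "model" artifact.
    import mlflow
    from mlflow.tracking import MlflowClient

    client = MlflowClient()
    run_id = "abc123"  # placeholder: id of an existing run

    # Register the model artifact from that run under a registry name.
    result = mlflow.register_model(f"runs:/{run_id}/model", "demo-classifier")

    # Promote the new version to Staging for validation.
    client.transition_model_version_stage(
        name="demo-classifier",
        version=result.version,
        stage="Staging",
    )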
Read more
Fractal Analytics
Posted by Eman Khan
Bengaluru (Bangalore), Hyderabad, Gurugram, Noida, Mumbai, Pune, Chennai, Coimbatore
5.5 - 9 yrs
₹25L - ₹38L / yr
LangChain
Large Language Models (LLM)
Retrieval Augmented Generation (RAG)
Artificial Intelligence (AI)
+8 more

Responsibilities

  • Design and implement advanced solutions utilizing Large Language Models (LLMs).
  • Demonstrate self-driven initiative by taking ownership and creating end-to-end solutions.
  • Conduct research and stay informed about the latest developments in generative AI and LLMs.
  • Develop and maintain code libraries, tools, and frameworks to support generative AI development.
  • Participate in code reviews and contribute to maintaining high code quality standards.
  • Engage in the entire software development lifecycle, from design and testing to deployment and maintenance.
  • Collaborate closely with cross-functional teams to align messaging, contribute to roadmaps, and integrate software into different repositories for core system compatibility.
  • Possess strong analytical and problem-solving skills.
  • Demonstrate excellent communication skills and the ability to work effectively in a team environment.


Primary Skills

  • Generative AI: Proficiency with SaaS LLMs, including LangChain, LlamaIndex, vector databases, and prompt engineering (CoT, ToT, ReAct, agents). Experience with Azure OpenAI, Google Vertex AI, and AWS Bedrock for text/audio/image/video modalities.
  • Familiarity with open-source LLMs, including tools like TensorFlow/PyTorch and Hugging Face. Techniques such as quantization, LLM fine-tuning using PEFT, RLHF, data annotation workflows, and GPU utilization.
  • Cloud: Hands-on experience with cloud platforms such as Azure, AWS, and GCP. Cloud certification is preferred.
  • Application Development: Proficiency in Python, Docker, FastAPI/Django/Flask, and Git.
  • Natural Language Processing (NLP): Hands-on experience in use-case classification, topic modeling, Q&A and chatbots, search, Document AI, summarization, and content generation.
  • Computer Vision and Audio: Hands-on experience in image classification, object detection, segmentation, image generation, and audio and video analysis.
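The skills above revolve around retrieval-augmented generation: embedding documents, retrieving the most relevant ones, and feeding them into a prompt. A rough, framework-free sketch follows; the embedding function and documents are hypothetical stand-ins for a real embedding model and vector database.

    # Minimal retrieval-augmented prompting sketch (hypothetical embeddings).
    import numpy as np

    def embed(text: str) -> np.ndarray:
        """Placeholder embedding: a real system would call an embedding model."""
        rng = np.random.default_rng(abs(hash(text)) % (2**32))
        return rng.standard_normal(64)

    documents = [
        "Refund requests are processed within 5 business days.",
        "Premium accounts include priority support.",
        "Passwords must be reset every 90 days.",
    ]
    doc_vectors = np.stack([embed(d) for d in documents])

    def retrieve(query: str, k: int = 2) -> list[str]:
        q = embed(query)
        sims = doc_vectors @ q / (np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(q))
        return [documents[i] for i in np.argsort(sims)[::-1][:k]]

    query = "How long do refunds take?"
    context = "\n".join(retrieve(query))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
    print(prompt)  # this prompt would then be sent to an LLM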
Read more
Wallero technologies
Posted by Hari krishna
Hyderabad
4 - 6 yrs
₹20L - ₹25L / yr
Data Science
Computer Vision
Python
Machine Learning (ML)
Natural Language Processing (NLP)
+1 more
  • Data Scientist with 4+ years of experience
  • Good working experience in computer vision and ML engineering
  • Strong knowledge of statistical modeling, hypothesis testing, and regression analysis (see the sketch below)
  • Experience developing APIs
  • Proficiency in Python and SQL
  • Working knowledge of Azure
  • Basic knowledge of NLP
  • Analytical thinking and problem-solving abilities
  • Excellent communication and strong collaboration skills
  • Ability to work independently
  • Attention to detail and commitment to high-quality results
  • Adaptability to fast-paced, dynamic environments
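To make the statistical-modeling expectation concrete, here is a small, hedged illustration (synthetic data, hypothetical variable names) of fitting a regression and running a hypothesis test in Python:

    # Illustrative only: OLS regression plus a two-sample t-test on synthetic data.
    import numpy as np
    from scipy import stats
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)

    # Regression: does ad spend predict weekly sales? (made-up data)
    ad_spend = rng.uniform(0, 100, size=200).reshape(-1, 1)
    sales = 3.0 * ad_spend.ravel() + rng.normal(0, 20, size=200)
    model = LinearRegression().fit(ad_spend, sales)
    print("slope:", model.coef_[0], "R^2:", model.score(ad_spend, sales))

    # Hypothesis test: do two customer groups differ in average order value?
    group_a = rng.normal(50, 10, size=100)
    group_b = rng.normal(53, 10, size=100)
    t_stat, p_value = stats.ttest_ind(group_a, group_b)
    print("t =", t_stat, "p =", p_value)  # a small p suggests a real difference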


Read more
Highfly Sourcing
Posted by Highfly Hr
Dubai, Augsburg, Germany, Zaragoza (Spain), Qatar, Salalah (Oman), Kuwait, Lebanon, Marseille (France), Genova (Italy), Winnipeg (Canada), Denmark, Poznan (Poland), Bengaluru (Bangalore), Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Mumbai, Hyderabad, Pune
3 - 10 yrs
₹25L - ₹30L / yr
Vue.js
AngularJS (1.x)
Angular (2+)
React.js
JavaScript
+14 more

Job Description

We are looking for a talented Java Developer for positions abroad. You will be responsible for developing high-quality software solutions, working on server-side components and integrations, and ensuring optimal performance and scalability.


Preferred Qualifications

  • Experience with microservices architecture.
  • Knowledge of cloud platforms (AWS, Azure).
  • Familiarity with Agile/Scrum methodologies.
  • Understanding of front-end technologies (HTML, CSS, JavaScript) is a plus.


Requirement Details

  • Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience).
  • Proven experience as a Java Developer or in a similar role.
  • Strong knowledge of the Java programming language and its frameworks (Spring, Hibernate).
  • Experience with relational databases (e.g., MySQL, PostgreSQL) and ORM tools.
  • Familiarity with RESTful APIs and web services.
  • Understanding of version control systems (e.g., Git).
  • Solid understanding of object-oriented programming (OOP) principles.
  • Strong problem-solving skills and attention to detail.

Read more
Golden Eagle IT Technologies Pvt Ltd
Posted by Akansha Kanojia
Hyderabad
4 - 6 yrs
₹20L - ₹22L / yr
Python
Django
Flask
Git
Docker
+4 more

Key Responsibilities:

  • Work closely with product managers, designers, frontend developers, and other cross-functional teams to ensure seamless integration and alignment of frontend and backend technologies, driving cohesive and high-quality product delivery.
  • Develop and implement coding standards and best practices for the backend team.
  • Document technical specifications and procedures.
  • Stay up-to-date with the latest backend technologies, trends, and best practices.
  • Collaborate with other departments to identify and address backend-related issues.
  • Conduct code reviews and ensure code quality and consistency across the backend team.
  • Create technical documentation, ensuring clarity for future development and maintenance.


Requirements:

  • Experience: 4-6 years of hands-on experience in backend development, with a strong background in product-based companies or startups.
  • Education: Bachelor's degree or above in Computer Science or a related field.
  • Programming skills: Proficient in Python and software development principles, with a focus on clean, maintainable code and industry best practices. Experienced in unit testing, AI-driven code reviews, version control with Git, CI/CD pipelines using GitHub Actions, and integrating New Relic for logging and APM into backend systems (a unit-test sketch follows this list).
  • Database development: Proficiency in developing and optimizing backend systems in both relational and non-relational database environments, such as MySQL and NoSQL databases.
  • GraphQL: Proven experience in developing and managing robust GraphQL APIs, preferably using Apollo Server. Ability to design type-safe GraphQL schemas and resolvers, ensuring seamless integration and high performance.
  • Cloud platforms: Familiar with AWS and experienced in Docker containerization and orchestrating containerized systems.
  • System architecture: Proficient in system design and architecture, with experience developing multi-tenant platforms, including security implementation, user onboarding, payment integration, and scalable architecture.
  • Linux systems: Familiarity with Linux systems is mandatory, including deployment and management.
  • Continuous learning: Stay current with industry trends and emerging technologies to influence architectural decisions and drive continuous improvement.
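As flagged in the programming-skills requirement above, here is a hedged sketch of the unit-testing style a GitHub Actions pipeline could run on every push; the function and tests are hypothetical, not part of the role.

    # test_pricing.py -- hypothetical unit test run by CI (e.g., `pytest -q`).
    import pytest

    def apply_discount(price: float, percent: float) -> float:
        """Toy business rule used only to illustrate the testing style."""
        if not 0 <= percent <= 100:
            raise ValueError("percent must be between 0 and 100")
        return round(price * (1 - percent / 100), 2)

    def test_apply_discount_basic():
        assert apply_discount(200.0, 25) == 150.0

    def test_apply_discount_rejects_bad_percent():
        with pytest.raises(ValueError):
            apply_discount(100.0, 150)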


Benefits:

  • Competitive salary.
  • Health insurance.
  • Casual dress code.
  • Dynamic and collaboration-friendly office.
  • Hybrid work schedule.

Industry

  • IT Services and IT Consulting

Employment Type

Full-time





Read more
Hyderabad
3 - 6 yrs
₹10L - ₹16L / yr
SQL
Spark
Analytical Skills
Hadoop
Communication Skills
+4 more

The Sr. Analytics Engineer would provide technical expertise in needs identification, data modeling, data movement, and transformation mapping (source to target), automation and testing strategies, translating business needs into technical solutions with adherence to established data guidelines and approaches from a business unit or project perspective.


Understands and leverages best-fit technologies (e.g., traditional star schema structures, cloud, Hadoop, NoSQL, etc.) and approaches to address business and environmental challenges.


Provides data understanding and coordinates data-related activities with other data management groups such as master data management, data governance, and metadata management.


Actively participates with other consultants in problem-solving and approach development.


Responsibilities :


Provide a consultative approach with business users, asking questions to understand the business need and deriving the data flow, conceptual, logical, and physical data models based on those needs.


Perform data analysis to validate data models and to confirm the ability to meet business needs.


Assist with and support setting the data architecture direction, ensuring data architecture deliverables are developed, ensuring compliance to standards and guidelines, implementing the data architecture, and supporting technical developers at a project or business unit level.


Coordinate and consult with the Data Architect, project manager, client business staff, client technical staff and project developers in data architecture best practices and anything else that is data related at the project or business unit levels.


Work closely with Business Analysts and Solution Architects to design the data model satisfying the business needs and adhering to Enterprise Architecture.


Coordinate with Data Architects, Program Managers and participate in recurring meetings.


Help and mentor team members to understand the data model and subject areas.


Ensure that the team adheres to best practices and guidelines.


Requirements :


- Strong working knowledge (at least 3 years) of Spark, Java/Scala/PySpark, Kafka, Git, Unix/Linux, and ETL pipeline design.

- Experience with Spark optimization, tuning, and resource allocation (see the sketch below).

- Excellent understanding of in-memory distributed computing frameworks like Spark, including parameter tuning and writing optimized workflow sequences.

- Experience with relational databases (e.g., PostgreSQL, MySQL) and NoSQL/analytical databases (e.g., Redshift, BigQuery, Cassandra).

- Familiarity with Docker, Kubernetes, Azure Data Lake/Blob Storage, AWS S3, Google Cloud Storage, etc.

- A deep understanding of the various stacks and components of the Big Data ecosystem.

- Hands-on experience with Python is a huge plus.
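A hedged illustration of the Spark tuning mentioned above (paths, table names, and settings are hypothetical): broadcasting a small dimension table to avoid a shuffle-heavy join, and caching a DataFrame that feeds two aggregations.

    # Illustrative PySpark tuning sketch: broadcast join + caching (hypothetical paths).
    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.functions import broadcast

    spark = (
        SparkSession.builder
        .appName("orders-enrichment")
        .config("spark.sql.shuffle.partitions", "200")  # tune to data volume
        .getOrCreate()
    )

    orders = spark.read.parquet("s3a://example-bucket/orders/")      # large fact table
    products = spark.read.parquet("s3a://example-bucket/products/")  # small dimension

    # Broadcasting the small table avoids shuffling the large one.
    enriched = orders.join(broadcast(products), on="product_id", how="left")

    # Cache because the result feeds two aggregations below.
    enriched.cache()

    daily_revenue = enriched.groupBy("order_date").agg(F.sum("amount").alias("revenue"))
    top_products = enriched.groupBy("product_name").agg(F.count("*").alias("orders"))

    daily_revenue.write.mode("overwrite").parquet("s3a://example-bucket/out/daily_revenue/")
    top_products.write.mode("overwrite").parquet("s3a://example-bucket/out/top_products/")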

Read more
Janapriya Educational Society
Miyapur, Hyderabad
0 - 5 yrs
₹2.5L - ₹3.5L / yr
Python
JavaScript
C++
HTML/CSS
Scratch
+1 more

Hi,

I am from the HR team at Janapriya School, Miyapur, Hyderabad, Telangana.

We are currently looking for a primary computer teacher.

The teacher should have at least 2 years of experience teaching computers.

Interested candidates can apply to the above posting.

Read more
Truminds Software Systems
Posted by Sonali Pandey
Hyderabad
2 - 3 yrs
₹5L - ₹7L / yr
C
C++
Linux/Unix
Python
Git
+1 more

Mandatory Skills

  • C/C++ programming
  • Linux system concepts
  • Good written and verbal communication skills
  • Good problem-solving skills
  • Python scripting experience
  • Prior experience with continuous integration and build systems is a plus
  • SCM tools like Git and Perforce are a plus
  • Repo, Git, and Gerrit tools
  • Android build system expertise
  • Automation development experience with tools like ElectricCommander, Jenkins, and Hudson (see the sketch below)
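As a hedged sketch of the Python build/CI scripting this role calls for (the commands and targets are placeholders, not an actual pipeline), a small helper that syncs sources, builds, runs tests, and fails the CI job on error could look like this:

    #!/usr/bin/env python3
    # Hypothetical CI helper: fetch sources, build, and report status.
    import subprocess
    import sys

    def run(cmd: list[str]) -> None:
        """Run a command, streaming output; exit non-zero on failure so CI fails."""
        print("+", " ".join(cmd))
        result = subprocess.run(cmd)
        if result.returncode != 0:
            sys.exit(result.returncode)

    def main() -> None:
        run(["git", "pull", "--ff-only"])          # sync the workspace
        run(["make", "-j4", "all"])                # build (placeholder target)
        run(["ctest", "--output-on-failure"])      # run the C/C++ test suite
        print("Build and tests passed.")

    if __name__ == "__main__":
        main()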


Read more
Zweeny Pvt Ltd
Posted by Preeti Harshavardhan
Hyderabad
5 - 8 yrs
₹24L - ₹40L / yr
Python

The Technical Lead will oversee all aspects of application development at TinyPal. This position involves both managing the development team and actively contributing to the coding and architecture, particularly in the backend development using Python. The ideal candidate will bring a strategic perspective to the development process, ensuring that our solutions are robust, scalable, and aligned with our business goals.



Key Responsibilities:

  • Lead and manage the application development team across all areas, including backend, frontend, and mobile app development.
  • Hands-on development and oversight of backend systems using Python, ensuring high performance, scalability, and integration with frontend services.
  • Architect and design innovative solutions that meet market needs and are aligned with the company’s technology strategy, with a strong focus on embedding AI technologies to enhance app functionalities.
  • Coordinate with product managers and other stakeholders to translate business needs into technical strategies, particularly in leveraging AI to solve complex problems and improve user experiences.
  • Maintain high standards of software quality by establishing good practices and habits within the development team.
  • Evaluate and incorporate new technologies and tools to improve application development processes, with a particular emphasis on AI and machine learning technologies.
  • Mentor and support team members to foster a collaborative and productive environment.
  • Lead the deployment and continuous integration of applications across various platforms, ensuring AI components are well integrated and perform optimally.


Required Skills and Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Software Engineering, or a related field.
  • Minimum of 7 years of experience in software development, with at least 1 year in a leadership role.
  • Expert proficiency in Python and experience with frameworks like Django or Flask.
  • Broad experience in full lifecycle development of large-scale applications.
  • Strong architectural understanding of both frontend and backend technologies, with a specific capability in integrating AI into complex systems.
  • Experience with cloud platforms (AWS, Azure, Google Cloud), and understanding of DevOps and CI/CD processes.
  • Demonstrated ability to think strategically about business, product, and technical challenges, including the adoption and implementation of AI solutions.
  • Excellent team management, communication, and interpersonal skills.



Read more
Hyderabad
0 - 1 yrs
₹2L - ₹3L / yr
Vue.js
AngularJS (1.x)
Angular (2+)
React.js
JavaScript
+7 more

Description:

We are looking for a highly motivated Full Stack Backend Software Intern to join our team. The ideal candidate should have a strong interest in AI, LLM (Large Language Models), and related technologies, along with the ability to work independently and complete tasks with minimal supervision.


Responsibilities:

  • Research and gather requirements for backend software projects.
  • Develop, test, and maintain backend components of web applications.
  • Collaborate with front-end developers to integrate user-facing elements with server-side logic.
  • Optimize applications for maximum speed and scalability.
  • Implement security and data protection measures.
  • Stay up-to-date with emerging technologies and industry trends.
  • Complete tasks with minimal hand-holding and supervision.
  • Assist with frontend tasks using JavaScript and React if required.


Requirements:

  • Proficiency in backend development languages such as Python or Node.js
  • Familiarity with frontend technologies like HTML, CSS, JavaScript, and React.
  • Experience with relational and non-relational databases.
  • Understanding of RESTful APIs and microservices architecture.
  • Knowledge of AI, LLM, and related technologies is a plus.
  • Ability to work independently and complete tasks with minimal supervision.
  • Strong problem-solving skills and attention to detail.
  • Excellent communication and teamwork skills.
  • Currently pursuing or recently completed a degree in Computer Science or related field.


Benefits:

  • Opportunity to work on cutting-edge technologies in AI and LLM.
  • Hands-on experience in developing backend systems for web applications.
  • Mentorship from experienced developers and engineers.
  • Flexible working hours and a supportive work environment.
  • Possibility of a full-time position based on performance.

If you are passionate about backend development, AI, and LLM, and are eager to learn and grow in a dynamic environment, we would love to hear from you. Apply now to join our team as a Full Stack Backend Software Intern.

Read more
Frisco Analytics Pvt Ltd
Posted by Cedrick Mariadas
Bengaluru (Bangalore), Hyderabad
5 - 8 yrs
₹15L - ₹20L / yr
Databricks
Apache Spark
Python
SQL
MySQL
+3 more

We are actively seeking a self-motivated Data Engineer with expertise in Azure cloud and Databricks, with a thorough understanding of Delta Lake and Lake-house Architecture. The ideal candidate should excel in developing scalable data solutions, crafting platform tools, and integrating systems, while demonstrating proficiency in cloud-native database solutions and distributed data processing.


Key Responsibilities:

  • Contribute to the development and upkeep of a scalable data platform, incorporating tools and frameworks that leverage Azure and Databricks capabilities.
  • Exhibit proficiency in various RDBMS databases such as MySQL and SQL-Server, emphasizing their integration in applications and pipeline development.
  • Design and maintain high-caliber code, including data pipelines and applications, utilizing Python, Scala, and PHP.
  • Implement effective data processing solutions via Apache Spark, optimizing Spark applications for large-scale data handling.
  • Optimize data storage using formats like Parquet and Delta Lake to ensure efficient data accessibility and reliable performance.
  • Demonstrate understanding of Hive Metastore, Unity Catalog Metastore, and the operational dynamics of external tables.
  • Collaborate with diverse teams to convert business requirements into precise technical specifications.

Requirements:

  • Bachelor’s degree in Computer Science, Engineering, or a related discipline.
  • Demonstrated hands-on experience with Azure cloud services and Databricks.
  • Proficient programming skills in Python, Scala, and PHP.
  • In-depth knowledge of SQL, NoSQL databases, and data warehousing principles.
  • Familiarity with distributed data processing and external table management.
  • Insight into enterprise data solutions for PIM, CDP, MDM, and ERP applications.
  • Exceptional problem-solving acumen and meticulous attention to detail.

Additional Qualifications :

  • Acquaintance with data security and privacy standards.
  • Experience in CI/CD pipelines and version control systems, notably Git.
  • Familiarity with Agile methodologies and DevOps practices.
  • Competence in technical writing for comprehensive documentation.
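To ground the Delta Lake and Spark items above, here is a hedged sketch (path and schema are hypothetical; on Databricks the Delta libraries are preinstalled, while locally the delta-spark package and its Spark session extensions are needed) of writing and reading a partitioned Delta table:

    # Illustrative Delta Lake usage (hypothetical path and columns).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("delta-demo").getOrCreate()

    events = spark.createDataFrame(
        [("u1", "click", "2024-01-01"), ("u2", "view", "2024-01-01")],
        ["user_id", "event_type", "event_date"],
    )

    # Write as a partitioned Delta table: Parquet files plus a transaction log.
    (events.write
        .format("delta")
        .mode("append")
        .partitionBy("event_date")
        .save("/tmp/delta/events"))

    # Read it back and aggregate.
    daily = (spark.read.format("delta").load("/tmp/delta/events")
             .groupBy("event_date", "event_type")
             .agg(F.count("*").alias("events")))
    daily.show()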


Read more
Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Pune, Hyderabad, Ahmedabad, Chennai
3 - 7 yrs
₹8L - ₹15L / yr
AWS Lambda
Amazon S3
Amazon VPC
Amazon EC2
Amazon Redshift
+3 more

Technical Skills:


  • Ability to understand and translate business requirements into design.
  • Proficient in AWS infrastructure components such as S3, IAM, VPC, EC2, and Redshift.
  • Experience in creating ETL jobs using Python/PySpark.
  • Proficiency in creating AWS Lambda functions for event-based jobs.
  • Knowledge of automating ETL processes using AWS Step Functions.
  • Competence in building data warehouses and loading data into them.


Responsibilities:


  • Understand business requirements and translate them into design.
  • Assess AWS infrastructure needs for development work.
  • Develop ETL jobs using Python/PySpark to meet requirements.
  • Implement AWS Lambda for event-based tasks.
  • Automate ETL processes using AWS Step Functions.
  • Build data warehouses and manage data loading.
  • Engage with customers and stakeholders to articulate the benefits of proposed solutions and frameworks.
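As a hedged example of the event-based Lambda work described above (the bucket layout and staging prefix are made up), a handler that reacts to an S3 object-created event might look like this:

    # Hypothetical AWS Lambda handler for an S3 "ObjectCreated" trigger.
    import json
    import urllib.parse

    import boto3

    s3 = boto3.client("s3")

    def lambda_handler(event, context):
        for record in event.get("Records", []):
            bucket = record["s3"]["bucket"]["name"]
            key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

            # Copy the incoming file into a staging prefix for downstream ETL.
            s3.copy_object(
                Bucket=bucket,
                Key=f"staging/{key}",
                CopySource={"Bucket": bucket, "Key": key},
            )
            print(f"Staged s3://{bucket}/{key}")

        return {"statusCode": 200, "body": json.dumps("ok")}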
Read more
Seneca Global IT Services Pvt Ltd
Hyderabad
1 - 3 yrs
₹4L - ₹8L / yr
Python
Flask
FastAPI
SQLAlchemy
Web Scraping

Responsibilities

  • Develop Python-based APIs using the FastAPI and Flask frameworks.
  • Develop Python-based automation scripts and libraries.
  • Develop front-end components using Vue.js and React.js.
  • Write and modify Dockerfiles for the back-end and front-end components.
  • Integrate CI/CD pipelines for automation and code-quality checks.
  • Write complex ORM mappings using SQLAlchemy.

 

Required Skills:

  • Strong experience in Python development in a full-stack environment is a requirement, including Node.js, Vue.js/Vuex, Flask, etc.
  • Experience with SQLAlchemy or similar ORM frameworks.
  • Experience working with geolocation APIs (e.g., Google Maps, Mapbox).
  • Experience using Elasticsearch and Airflow is a plus.
  • Strong knowledge of SQL; comfortable working with MySQL and/or PostgreSQL databases.
  • Understanding of data modeling concepts.
  • Experience with REST.
  • Experience with Git, GitFlow, and the code review process.
  • Good understanding of basic UI and UX principles.
  • Excellent problem-solving and communication skills.
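As a hedged sketch of the FastAPI-plus-SQLAlchemy work listed above (the table, columns, and SQLite URL are placeholders), a minimal API with two ORM-backed endpoints could look like this; it would be run with, e.g., uvicorn app:app --reload.

    # Minimal FastAPI + SQLAlchemy sketch (hypothetical schema).
    from fastapi import FastAPI
    from sqlalchemy import Column, Integer, String, create_engine, select
    from sqlalchemy.orm import Session, declarative_base

    engine = create_engine("sqlite:///./demo.db")  # placeholder database
    Base = declarative_base()

    class Task(Base):
        __tablename__ = "tasks"
        id = Column(Integer, primary_key=True)
        title = Column(String, nullable=False)

    Base.metadata.create_all(engine)
    app = FastAPI()

    @app.post("/tasks")
    def create_task(title: str):
        with Session(engine) as session:
            task = Task(title=title)
            session.add(task)
            session.commit()
            return {"id": task.id, "title": task.title}

    @app.get("/tasks")
    def list_tasks():
        with Session(engine) as session:
            rows = session.execute(select(Task)).scalars().all()
            return [{"id": t.id, "title": t.title} for t in rows]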

 

Read more
EKSAQ
Posted by Eksaq HR
Hyderabad
5 - 8 yrs
₹8L - ₹12L / yr
AngularJS (1.x)
Angular (2+)
React.js
NodeJS (Node.js)
MongoDB
+2 more

Position Overview:

We are searching for an experienced Senior MERN Stack Developer to lead our development efforts. Your expertise will drive the creation of cutting-edge web applications while mentoring junior developers and contributing to technical strategy.

Key Responsibilities:

  • Lead and participate in the architecture, design, and development of complex applications.
  • Mentor and guide junior developers, fostering skill development and growth.
  • Collaborate with cross-functional teams to define technical roadmaps and strategies.
  • Conduct code reviews and ensure adherence to coding standards and best practices.
  • Stay updated with emerging technologies and advocate for their integration.
  • Develop and maintain robust and scalable web applications using the MERN stack.
  • Collaborate with front-end and back-end developers to define and implement innovative solutions.
  • Design and implement RESTful APIs for seamless integration between front-end and back-end systems.
  • Work closely with UI/UX designers to create responsive and visually appealing user interfaces.
  • Troubleshoot, debug, and optimize code to ensure high performance and reliability.
  • Implement security and data protection measures in line with industry best practices.
  • Stay updated on emerging trends and technologies in web development.

Qualifications & Skills:

  • Bachelor's or Master's degree in Computer Science or a related field.
  • Proven experience as a Senior MERN Stack Developer.
  • Strong proficiency in React.js, Node.js, Express.js, and MongoDB.
  • Strong proficiency in TypeScript, JavaScript, HTML, and CSS.
  • Familiarity with front-end frameworks like Bootstrap, Material-UI, etc.
  • Experience with version control systems, such as Git.
  • Knowledge of database design and management, including both SQL and NoSQL databases.
  • Leadership skills and the ability to guide and inspire a team.
  • Excellent problem-solving abilities and a strategic mindset.
  • Effective communication and collaboration skills.
  • Knowledge of AWS and S3 cloud storage.

Location: The position is based in Hyderabad.

Join us in revolutionizing education! If you are a passionate MERN Stack developer with a vision for transforming education in line with NEP 2020, we would love to hear from you. Apply now and be part of an innovative team shaping the future of education.

Read more
Publicis Sapient
Posted by Mohit Singh
Bengaluru (Bangalore), Pune, Hyderabad, Gurugram, Noida
5 - 11 yrs
₹20L - ₹36L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+7 more

Publicis Sapient Overview:

As a Senior Associate in Data Engineering at Publicis Sapient, you will translate client requirements into technical designs and implement components for data engineering solutions, applying a deep understanding of data integration and big data design principles to create custom solutions or implement package solutions. You will independently drive design discussions to ensure the overall health of the solution.

Job Summary:

As a Senior Associate L2 in Data Engineering, you will translate client requirements into technical designs and implement components for data engineering solutions, applying the same data integration and big data design principles, and independently driving design discussions to ensure the overall health of the solution.

The role requires a hands-on technologist with a strong programming background in Java, Scala, or Python, experience in data ingestion, integration, and wrangling, computation, and analytics pipelines, and exposure to Hadoop ecosystem components. Hands-on knowledge of at least one of the AWS, GCP, or Azure cloud platforms is also required.


Role & Responsibilities:

Your role is focused on Design, Development and delivery of solutions involving:

• Data Integration, Processing & Governance

• Data Storage and Computation Frameworks, Performance Optimizations

• Analytics & Visualizations

• Infrastructure & Cloud Computing

• Data Management Platforms

• Implement scalable architectural models for data processing and storage

• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time mode

• Build functionality for data analytics, search and aggregation

Experience Guidelines:

Mandatory Experience and Competencies:

# Competency

1. Overall 5+ years of IT experience, with 3+ years in data-related technologies.

2. Minimum 2.5 years of experience in Big Data technologies and working exposure to the related data services of at least one cloud platform (AWS / Azure / GCP).

3. Hands-on experience with the Hadoop stack: HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow, and other components required to build end-to-end data pipelines (see the streaming sketch below).

4. Strong experience in at least one of the programming languages Java, Scala, or Python; Java preferred.

5. Hands-on working knowledge of NoSQL and MPP data platforms like HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.

6. Well-versed, working knowledge of data-platform-related services on at least one cloud platform, IAM, and data security.
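A hedged illustration of that streaming-pipeline experience (broker, topic, and paths are placeholders; the job also needs the spark-sql-kafka connector on its classpath):

    # Illustrative Spark Structured Streaming job: Kafka -> Parquet.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

    raw = (spark.readStream
           .format("kafka")
           .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
           .option("subscribe", "events")                     # placeholder topic
           .load())

    # Kafka delivers bytes; cast the value payload to string for downstream parsing.
    events = raw.select(
        F.col("key").cast("string"),
        F.col("value").cast("string").alias("payload"),
        "timestamp",
    )

    query = (events.writeStream
             .format("parquet")
             .option("path", "/data/landing/events")           # placeholder sink
             .option("checkpointLocation", "/data/checkpoints/events")
             .outputMode("append")
             .start())

    query.awaitTermination()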


Preferred Experience and Knowledge (Good to Have):

# Competency

1. Good knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres), with hands-on experience.

2. Knowledge of data governance processes (security, lineage, catalog) and tools like Collibra, Alation, etc.

3. Knowledge of distributed messaging frameworks like ActiveMQ / RabbitMQ / Solace, search and indexing, and microservices architectures.

4. Performance tuning and optimization of data pipelines.

5. CI/CD: infrastructure provisioning on cloud, automated build and deployment pipelines, code quality.

6. Cloud data specialty and other related Big Data technology certifications.


Personal Attributes:

• Strong written and verbal communication skills

• Articulation skills

• Good team player

• Self-starter who requires minimal oversight

• Ability to prioritize and manage multiple tasks

• Process orientation and the ability to define and set up processes


Read more
HNM Solutions
Posted by Yogitha Rani
Chennai, Bengaluru (Bangalore), Hyderabad, Kochi
3 - 5 yrs
₹6L - ₹12L / yr
Python
Django

Role: Python-Django Developer 

Location: Noida, India


Description:

  • Develop web applications using Python and Django.
  • Write clean and maintainable code following best practices and coding standards.
  • Collaborate with other developers and stakeholders to design and implement new features.
  • Participate in code reviews and maintain code quality.
  • Troubleshoot and debug issues as they arise.
  • Optimize applications for maximum speed and scalability.
  • Stay up-to-date with emerging trends and technologies in web development.

Requirements:

  • Bachelor's or Master's degree in Computer Science, Computer Engineering or a related field.
  • 4+ years of experience in web development using Python and Django.
  • Strong knowledge of object-oriented programming and design patterns.
  • Experience with front-end technologies such as HTML, CSS, and JavaScript.
  • Understanding of RESTful web services.
  • Familiarity with database technologies such as PostgreSQL or MySQL.
  • Experience with version control systems such as Git.
  • Ability to work in a team environment and communicate effectively with team members.
  • Strong problem-solving and analytical skills.


Read more
Publicis Sapient
Posted by Mohit Singh
Bengaluru (Bangalore), Gurugram, Pune, Hyderabad, Noida
4 - 10 yrs
Best in industry
PySpark
Data engineering
Big Data
Hadoop
Spark
+6 more

Publicis Sapient Overview:

As a Senior Associate in Data Engineering at Publicis Sapient, you will translate client requirements into technical designs and implement components for data engineering solutions, applying a deep understanding of data integration and big data design principles to create custom solutions or implement package solutions. You will independently drive design discussions to ensure the overall health of the solution.

Job Summary:

As a Senior Associate L1 in Data Engineering, you will produce technical designs and implement components for data engineering solutions, applying the same data integration and big data design principles, and independently driving design discussions to ensure the overall health of the solution.

The role requires a hands-on technologist with a strong programming background in Java, Scala, or Python, experience in data ingestion, integration, and wrangling, computation, and analytics pipelines, and exposure to Hadoop ecosystem components. Hands-on knowledge of at least one of the AWS, GCP, or Azure cloud platforms is preferred.


Role & Responsibilities:

Job Title: Senior Associate L1 – Data Engineering

Your role is focused on Design, Development and delivery of solutions involving:

• Data Ingestion, Integration and Transformation

• Data Storage and Computation Frameworks, Performance Optimizations

• Analytics & Visualizations

• Infrastructure & Cloud Computing

• Data Management Platforms

• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time

• Build functionality for data analytics, search and aggregation


Experience Guidelines:

Mandatory Experience and Competencies:

# Competency

1. Overall 3.5+ years of IT experience, with 1.5+ years in data-related technologies.

2. Minimum 1.5 years of experience in Big Data technologies.

3. Hands-on experience with the Hadoop stack: HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow, and other components required to build end-to-end data pipelines. Working knowledge of real-time data pipelines is an added advantage.

4. Strong experience in at least one of the programming languages Java, Scala, or Python; Java preferred.

5. Hands-on working knowledge of NoSQL and MPP data platforms like HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.


Preferred Experience and Knowledge (Good to Have):

# Competency

1. Good knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres), with hands-on experience.

2. Knowledge of data governance processes (security, lineage, catalog) and tools like Collibra, Alation, etc.

3. Knowledge of distributed messaging frameworks like ActiveMQ / RabbitMQ / Solace, search and indexing, and microservices architectures.

4. Performance tuning and optimization of data pipelines.

5. CI/CD: infrastructure provisioning on cloud, automated build and deployment pipelines, code quality.

6. Working knowledge of data-platform-related services on at least one cloud platform, IAM, and data security.

7. Cloud data specialty and other related Big Data technology certifications.


Personal Attributes:

• Strong written and verbal communication skills

• Articulation skills

• Good team player

• Self-starter who requires minimal oversight

• Ability to prioritize and manage multiple tasks

• Process orientation and the ability to define and set up processes

Read more
Technogen India Pvt Ltd
Posted by Mounika G
Hyderabad
11 - 16 yrs
₹24L - ₹27L / yr
Data Warehouse (DWH)
Informatica
ETL
Amazon Web Services (AWS)
SQL
+1 more

Daily and monthly responsibilities

  • Review and coordinate with business application teams on data delivery requirements.
  • Develop estimation and proposed delivery schedules in coordination with development team.
  • Develop sourcing and data delivery designs.
  • Review data model, metadata and delivery criteria for solution.
  • Review and coordinate with team on test criteria and performance of testing.
  • Contribute to the design, development and completion of project deliverables.
  • Complete in-depth data analysis and contribution to strategic efforts
  • Complete understanding of how we manage data with focus on improvement of how data is sourced and managed across multiple business areas.

 

Basic Qualifications

  • Bachelor’s degree.
  • 5+ years of data analysis working with business data initiatives.
  • Knowledge of Structured Query Language (SQL) and use in data access and analysis.
  • Proficient in data management including data analytical capability.
  • Excellent verbal and written communication skills, with high attention to detail.
  • Experience with Python.
  • Presentation skills in demonstrating system design and data analysis solutions.


Read more
GradRight
Posted by Vivek Jadli
Hyderabad, Gurugram
9 - 12 yrs
Best in industry
Engineering Management
NodeJS (Node.js)
Vue.js
React.js
Go Programming (Golang)
+4 more

GradRight is an ed-fin-tech startup focused on global higher education. Using data science, technology and strategic partnerships across the industry, we enable students to find the “Right University” at the “Right Cost”. We are on a mission to aid a million students to find their best-fit universities and financial offerings by 2025.

Our flagship product, FundRight, is the world's first student loan bidding platform. In a short span of 10 months, we have facilitated disbursements of more than $50 million in loans, and we are poised to scale up rapidly.

We are launching our second product - SelectRight as an innovative approach to college selection and student recruitment for students and universities, respectively. The product rests on the three pillars of data science, transparency and ethics and hopes to create value for students and universities. 

 

Brief:

We are pursuing a complex set of problems that involve building for an international audience and for an industry that has largely been service-centric. As a Principal Engineer at GradRight, you’ll bring an unmatched customer-centricity to your work, with a focus on building for the long term and large scale.

You’ll drive the creation of frameworks that enable flexible/scalable customer journeys and tie them with institutional knowledge to help us build the best experiences for students and our partners. You’ll also manage a team of high performers to achieve the planned outcomes.

You’ll own the technology strategy of the engineering organization and be a key decision maker when it comes to processes and execution.

 

Responsibilities:

  1. Drive design discussions and decisions around building scalable and modular architecture
  2. Work with product, engineering and business teams to ideate on technology strategy and line up initiatives around the same
  3. Build clean, modular and scalable backend services
  4. Build clean, modular and scalable frontends
  5. Own quality and velocity of releases across the engineering organization
  6. Manage and mentor a team of engineers
  7. Participate in sprint ceremonies and actively contribute to scaling the engineering organization from a process perspective
  8. Stay on top of the software engineering ecosystem, propose and implement new technologies/methodologies as per the business needs
  9. Contribute to engineering hiring by conducting interviews
  10. Champion infrastructure-as-code mindset and encourage automation
  11. Identify problems around engineering processes, propose solutions and drive implementations for the same

 

Requirements:

  1. At least 8 years of experience, building large scale applications
  2. Experience working at startups in growth phase with war stories to share
  3. Experience with frontend technologies like vue.js or react.js
  4. Strong experience with at least one backend framework, preferably express.js
  5. Extensive experience in at least one programming language (preferably Javascript, GoLang) and ability to write maintainable, scalable and unit-testable code
  6. Experience in CI/CD and cloud infrastructure management
  7. Strong understanding of software design principles and patterns
  8. Excellent command over data structures and algorithms
  9. Passion for solving complex problems
  10. Good understanding of various database technologies with strong opinions around their use cases
  11. Experience with performance monitoring and scaling backend services
  12. Experience with microservices and distributed systems in general
  13. Experience with team management
  14. Excellent written and verbal communication skills


Good to have:

  1. Worked on products that addressed an international audience
  2. Worked on products that scaled to millions of users
  3. Exposure to emerging/latest technologies like blockchain, bots, AR/VR
  4. Exposure to AI/ML
Read more
Caw Studios
Posted by Poojitha Mukkollu
Hyderabad
2 - 5 yrs
₹6L - ₹15L / yr
Cypress
Playwright
JavaScript
Selenium
Java
+3 more

Ever dreamed of being part of new product initiatives? Feel the energy and excitement of working on version 1 of a product, bringing an idea on paper to life. Do you crave working on SaaS products that could become the next Uber, Airbnb, or Flipkart? We offer you the chance to be part of a team that will be leading the development of a SaaS product.


Our organization relies on its central engineering workforce to develop and maintain a product portfolio of several different startups. Our product portfolio continuously grows as we incubate more startups, which means that different products are very likely to make use of different technologies, architecture & frameworks - a fun place for smart tech lovers! We are looking for a Software Development Engineer in Test to join one of our engineering teams at our office in Hyderabad.


What would you do?


● Improve automation code structure and framework architecture in terms of maintainability, execution speed, and coverage; write, co-write, and review test design/plan documentation.

● Own communication throughout the sprint/release cycle, quality of features, and delivery of the entire feature.

● Lead new language/framework POC within the technical focus area.

● Drive the design/code review process for test automation, seeking and providing constructive criticism.

● Ensure your team has strong sets of documentation and journals of how their test design and architecture/product evolve.

● Lead effort in working with other teams and counterparts to solve problems affecting the team's overall delivery.

● Participate in the prioritization of cross-team automation initiatives & lead those within your team.

● Participate/Support in production/Customer deployment.


Who Should Apply?

● 2-5 years of experience in professional testing.

● 1+ years of experience working with automation testing.

● Familiarity with Microservices architecture.

● Deep understanding of Manual and automation test methodologies and principles.

● Strong problem-solving, interpersonal, organizational, and time management skills.

● Passion for self-improvement and continual learning.

● Great attitude and adaptability to taking on many diverse responsibilities.

● Experience working with Web and API Testing - both manual & automation.

● Experience using tools such as Jira, Git, Cypress, Kubernetes, Selenium, Docker, TestNG, CI/CD pipelines, JavaScript &TypeScript testing tools, and API Automation

● Experience in setting up test Infra for Functional and Non-Functional testing.

● Experience working with Agile process management methodology

● Experience in Performance/ Security Testing and Linux/ Unix commands.
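As a hedged sketch of the API automation mentioned above (the endpoint and payload are hypothetical), a pytest-style check built on the requests library might look like this:

    # Hypothetical API automation test (run with pytest; endpoint is a placeholder).
    import requests

    BASE_URL = "https://api.example.com"  # placeholder service under test

    def test_create_and_fetch_user():
        # Create a resource via the API.
        created = requests.post(
            f"{BASE_URL}/users",
            json={"name": "Test User", "email": "test@example.com"},
            timeout=10,
        )
        assert created.status_code == 201
        user_id = created.json()["id"]

        # Fetch it back and verify the payload round-trips.
        fetched = requests.get(f"{BASE_URL}/users/{user_id}", timeout=10)
        assert fetched.status_code == 200
        assert fetched.json()["email"] == "test@example.com"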


About CAW Studios:

CAW Studios is a Product Engineering Company of 50+ super-geeks based out of Hyderabad. We run complete engineering (Dev + DevOps) for several products like Interakt, CashFlo, KaiPulse, and FastBar. We are also part of the global engineering teams for Haptik, EmailAnalytics, SenorPago, and GrowthZone. We are obsessed with automation, DevOps, OOPS, and SOLID. We are not into one tech stack - we are into solving problems.


Know More About CAW Studios:

Role: Software Development and Test Engineer - Cypress
Find us: https://goo.gl/maps/dvR6L26JUa42
Website: https://www.cawstudios.com/
Know more: https://www.cawstudios.com/handbook

Read more
Aprajita Consultancy
Agency job via Squarcell Resource India Pvt by Pranjali Reddy
Hyderabad
8 - 10 yrs
₹13L - ₹15L / yr
SQL Server
Oracle
Cassandra
Terraform
Shell Scripting
+3 more

Role: Oracle DBA Developer


Location: Hyderabad


Required Experience: 8 + Years


Skills: DBA, Terraform, Ansible, Python, shell scripting, DevOps activities, Oracle DBA, SQL Server, Cassandra, Oracle SQL/PLSQL, MySQL/Oracle/MSSQL/Mongo/Cassandra, security measure configuration




Roles and Responsibilities:


 


1. 8+ years of hands-on DBA experience in one or many of the following: SQL Server, Oracle, Cassandra


2. DBA experience in a SRE environment will be an advantage.


3. Experience in Automation/building databases by providing self-service tools. analyze and implement solutions for database administration (e.g., backups, performance tuning, Troubleshooting, Capacity planning)


4. Analyze solutions and implement best practices for cloud database and their components.


5. Build and enhance tooling, automation, and CI/CD workflows (Jenkins, etc.) that provide safe self-service capabilities to the engineering teams.


6. Implement proactive monitoring and alerting to detect issues before they impact users. Use a metrics-driven approach to identify and root-cause performance and scalability bottlenecks in the system.


7. Work on automation of database infrastructure and help engineering succeed by providing self-service tools.


8. Write database documentation, including data standards, procedures, and definitions for the data dictionary (metadata)


9. Monitor database performance, control access permissions and privileges, capacity planning, implement changes and apply new patches and versions when required.


10. Recommend query and schema changes to optimize the performance of database queries.


11. Have experience with cloud-based environments (OCI, AWS, Azure) as well as On-Premises.


12. Have experience with cloud database such as SQL server, Oracle, Cassandra


13. Have experience with infrastructure automation and configuration management (Jira, Confluence, Ansible, Gitlab, Terraform)


14. Have excellent written and verbal English communication skills.


15. Planning, managing, and scaling of data stores to ensure a business’ complex data requirements are met and it can easily access its data in a fast, reliable, and safe manner.


16. Ensures the quality of orchestration and integration of tools needed to support daily operations by patching together existing infrastructure with cloud solutions and additional data infrastructures.


17. Data Security and protecting the data through rigorous testing of backup and recovery processes and frequently auditing well-regulated security procedures.


18. Use software and tooling to automate manual tasks and enable engineers to move fast without the concern of losing data during their experiments.


19. Define service level objectives (SLOs) and perform risk analysis to determine which problems to address and which to automate.


20. Bachelor's Degree in a technical discipline required.


21. DBA Certifications required: Oracle, SQLServer, Cassandra (2 or more)


22. Cloud and DevOps certifications will be an advantage.


 


Must-have Skills:

  • Oracle DBA with development
  • SQL
  • DevOps tools
  • Cassandra






Read more
Fintrac Global Services
Agency job via Vmultiply Solutions by Mounica Buddharaju
Hyderabad
5 - 8 yrs
₹5L - ₹15L / yr
Python
Bash
Google Cloud Platform (GCP)
Amazon Web Services (AWS)
Windows Azure
+2 more

Required Qualifications:

  • Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent experience.
  • 5+ years of experience in a DevOps role, preferably for a SaaS or software company.
  • Expertise in cloud computing platforms (e.g., AWS, Azure, GCP).
  • Proficiency in scripting languages (e.g., Python, Bash, Ruby).
  • Extensive experience with CI/CD tools (e.g., Jenkins, GitLab CI, Travis CI).
  • Extensive experience with NGINX and similar web servers.
  • Strong knowledge of containerization and orchestration technologies (e.g., Docker, Kubernetes).
  • Familiarity with infrastructure-as-code tools (e.g., Terraform, CloudFormation).
  • Ability to work on-call as needed and respond to emergencies in a timely manner.
  • Experience with high-transaction e-commerce platforms.

Preferred Qualifications:

  • Certifications in cloud computing or DevOps are a plus (e.g., AWS Certified DevOps Engineer, Azure DevOps Engineer Expert).
  • Experience in a high-availability, 24x7x365 environment.
  • Strong collaboration, communication, and interpersonal skills.
  • Ability to work independently and as part of a team.

Read more
Quarks Technosoft Pvt Ltd
Posted by AbhishekRaj Gupta
Hyderabad
6 - 9 yrs
₹6L - ₹28L / yr
NodeJS (Node.js)
DynamoDB
AWS Lambda
Python

Job Description:

  • Design and develop IoT/cloud-based TypeScript/JavaScript/Node.js applications using Amazon cloud computing services.
  • Work closely with onsite, offshore, and cross-functional teams (Product Management, UI/UX developers, web and mobile developers, SQA) to effectively use technologies and deliver high-quality IoT applications on time.
  • Resolve bugs and issues.
  • Proactively identify risks and failure modes early in the development lifecycle and develop POCs to mitigate those risks early in the program.
  • Assertive communication and team skills.

Primary Skills:

  • Hands-on experience (3+ years) in an AWS cloud-native environment, with work experience in AWS Lambda, Kinesis, and DynamoDB.
  • 3+ years of experience working with Node.js, Python, unit testing, and Git.
  • 3+ years of work experience with document, relational, or time-series databases.
  • 2+ years of work experience with TypeScript.
  • 1+ years with an IaC framework like Serverless or CDK, with CloudFormation knowledge.

Read more
Master Works
Posted by Spandana Bomma
Hyderabad
3 - 7 yrs
₹6L - ₹15L / yr
Machine Learning (ML)
Data Science
Computer Vision
recommendation algorithm
Image Processing
+7 more

Job Description-

Responsibilities:

* Work on real-world computer vision problems

* Write robust industry-grade algorithms

* Leverage OpenCV, Python and deep learning frameworks to train models.

* Use Deep Learning technologies such as Keras, Tensorflow, PyTorch etc.

* Develop integrations with various in-house or external microservices.

* Must have experience in deployment practices (Kubernetes, Docker, containerization, etc.) and model compression practices

* Research latest technologies and develop proof of concepts (POCs).

* Build and train state-of-the-art deep learning models to solve Computer Vision related problems, including, but not limited to:

* Segmentation

* Object Detection

* Classification

* Objects Tracking

* Visual Style Transfer

* Generative Adversarial Networks

* Work alongside other researchers and engineers to develop and deploy solutions for challenging real-world problems in the area of Computer Vision

* Develop and plan Computer Vision research projects in terms of scope of work, including formal definition of research objectives and outcomes

* Provide specialized technical / scientific research to support the organization on different projects for existing and new technologies
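
For illustration only, and not a description of this team's actual codebase: a minimal PyTorch training loop of the kind the responsibilities above refer to, using random tensors in place of a real dataset so the snippet stays self-contained. Shapes, class count, and hyperparameters are arbitrary assumptions.

# Illustrative sketch only; data, architecture, and hyperparameters are arbitrary.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(16, 5),                 # 5 classes, chosen arbitrarily
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

images = torch.randn(8, 3, 64, 64)    # stand-in for a batch of images
labels = torch.randint(0, 5, (8,))    # stand-in for ground-truth labels

for epoch in range(3):
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss={loss.item():.4f}")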

Skills:

* Object Detection

* Computer Science

* Image Processing

* Computer Vision

* Deep Learning

* Artificial Intelligence (AI)

* Pattern Recognition

* Machine Learning

* Data Science

* Generative Adversarial Networks (GANs)

* Flask

* SQL

Read more
A fast growing Big Data company


Agency job
via Careerconnects by Kumar Narayanan
Noida, Bengaluru (Bangalore), Chennai, Hyderabad
6 - 8 yrs
₹10L - ₹15L / yr
AWS Glue
SQL
skill iconPython
PySpark
Data engineering
+6 more

AWS Glue Developer 

Work Experience: 6 to 8 Years

Work Location:  Noida, Bangalore, Chennai & Hyderabad

Must-Have Skills: AWS Glue, DMS, SQL, Python, PySpark, Data Integration, and DataOps

Job Reference ID: BT/F21/IND


Job Description:

Design, build and configure applications to meet business process and application requirements.


Responsibilities:

➢ 7 years of work experience with ETL, data modelling, and data architecture.

➢ Proficient in ETL optimization, designing, coding, and tuning big data processes using PySpark.

➢ Extensive experience building data platforms on AWS using core AWS services (Step Functions, EMR, Lambda, Glue, Athena, Redshift, Postgres, RDS, etc.) and designing/developing data engineering solutions.

➢ Orchestrate pipelines using Airflow.


Technical Experience:

Hands-on experience developing a data platform and its components: data lake, cloud data warehouse, APIs, and batch and streaming data pipelines. Experience building data pipelines and applications to stream and process large datasets at low latencies.


➢ Enhancements, new development, defect resolution and production support of Big data ETL development using AWS native services.

➢ Create data pipeline architecture by designing and implementing data ingestion solutions.

➢ Integrate data sets using AWS services such as Glue, Lambda functions/ Airflow.

➢ Design and optimize data models on AWS Cloud using AWS data stores such as Redshift, RDS, S3, Athena.

➢ Author ETL processes using Python and PySpark (a minimal Glue job sketch follows this list).

➢ Build Redshift Spectrum direct transformations and data modelling using data in S3.

➢ ETL process monitoring using CloudWatch events.

➢ You will be working in collaboration with other teams; good communication is a must.

➢ Must have experience using AWS service APIs, the AWS CLI, and SDKs.
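
For illustration only: a minimal sketch of an AWS Glue (PySpark) job in the spirit of the responsibilities above. It reads a Glue Data Catalog table, applies a simple filter-and-aggregate transformation, and writes Parquet to S3. The database, table, column, and bucket names are hypothetical placeholders.

# Illustrative Glue job sketch; catalog, column, and S3 names are placeholders.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Data Catalog (database/table names are placeholders)
orders = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_orders"
).toDF()

# Simple transformation: keep completed orders and aggregate by day
daily = (
    orders.filter(orders.status == "COMPLETED")
    .groupBy("order_date")
    .sum("amount")
)

# Write the result back to S3 as Parquet (path is a placeholder)
daily.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_orders/")

job.commit()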


Professional Attributes:

➢ Experience operating very large data warehouses or data lakes. Expert-level skills in writing and optimizing SQL. Extensive, real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technology.

➢ Must have 6+ years of big data ETL experience using Python, S3, Lambda, DynamoDB, Athena, and Glue in an AWS environment.

➢ Expertise in S3, RDS, Redshift, Kinesis, EC2 clusters highly desired.


Qualification:

➢ Degree in Computer Science, Computer Engineering or equivalent.


Salary: Commensurate with experience and demonstrated competence

Read more
Hyderabad
2 - 5 yrs
₹6L - ₹16L / yr
skill iconVue.js
skill iconAngularJS (1.x)
skill iconAngular (2+)
skill iconReact.js
skill iconJavascript
+8 more
  • 2 to 5 years of experience (or equivalent understanding of software engineering)
  • Familiar with one backend language (Node, Go, Java, Python)
  • Familiar with Javascript/Typescript and a UI framework
  • Willingness and interest in learning new tech/processes (Airflow, AWS, IaaS, etc.)


Read more
Orbital
Rohini P
Posted by Rohini P
Hyderabad
0 - 4 yrs
₹5L - ₹16L / yr
skill iconVue.js
skill iconAngularJS (1.x)
skill iconAngular (2+)
skill iconReact.js
skill iconJavascript
+7 more
  • 0 to 3 years of experience (or equivalent understanding of software engineering)
  • Familiar with Javascript/Typescript and React.js
  • Familiar with one backend language (Node, Go, Java, Python) and one relational database (any SQL)
  • Willingness and interest in learning new tech/processes (Next.js, Angular, IaaS, etc.)


Read more
InnoMick Technology Pvt Ltd

at InnoMick Technology Pvt Ltd

2 candid answers
Sravani Vadranam
Posted by Sravani Vadranam
Hyderabad
6 - 6 yrs
₹10L - ₹15L / yr
skill iconAngularJS (1.x)
skill iconAngular (2+)
skill iconReact.js
skill iconNodeJS (Node.js)
skill iconMongoDB
+7 more



Position: Technical Architect

Location: Hyderabad

 Experience: 6+ years


Job Summary:

We are looking for an experienced Technical Architect with a strong background in Python, Node.js, and React to lead the design and development of complex and scalable software solutions. The ideal candidate will possess exceptional technical skills, a deep understanding of software architecture principles, and a proven track record of successfully delivering high-quality projects. You should be capable of leading a cross-functional team that's responsible for the full software development life cycle, from conception to deployment with Agile methodologies.

 

 

Responsibilities:

●       Lead the design, development, and deployment of software solutions, ensuring architectural integrity and high performance.

●       Collaborate with cross-functional teams, including developers, designers, and product managers, to define technical requirements and create effective solutions.

●       Provide technical guidance and mentorship to development teams, ensuring best practices and coding standards are followed.

●       Evaluate and recommend appropriate technologies, frameworks, and tools to achieve project goals.

●       Drive continuous improvement by staying updated with industry trends, emerging technologies, and best practices.

●       Conduct code reviews, identify areas of improvement, and promote a culture of excellence in software development.

●       Participate in architectural discussions, making strategic decisions and aligning technical solutions with business objectives.

●       Troubleshoot and resolve complex technical issues, ensuring optimal performance and reliability of software applications.

●       Collaborate with stakeholders to gather and analyze requirements, translating them into technical specifications.

●       Define and enforce architectural patterns, ensuring scalability, security, and maintainability of systems.

●       Lead efforts to refactor and optimize existing codebase, enhancing performance and maintainability.

Qualifications:

●       Bachelor's degree in Computer Science, Software Engineering, or a related field. Master's degree is a plus.

●       Minimum of 8 years of experience in software development with a focus on Python, Node.js, and React.

●       Proven experience as a Technical Architect, leading the design and development of complex software systems.

●       Strong expertise in software architecture principles, design patterns, and best practices.

●       Extensive hands-on experience with Python, Node.js, and React, including designing and implementing scalable applications.

●       Solid understanding of microservices architecture, RESTful APIs, and cloud technologies (AWS, GCP, or Azure).

●       Extensive knowledge of JavaScript, web stacks, libraries, and frameworks.

●       Should create automation test cases and unit test cases (optional)

●       Proficiency in database design, optimization, and data modeling.

●       Experience with DevOps practices, CI/CD pipelines, and containerization (Docker, Kubernetes).

●       Excellent problem-solving skills and the ability to troubleshoot complex technical issues.

●       Strong communication skills, both written and verbal, with the ability to effectively interact with cross-functional teams.

●       Prior experience in mentoring and coaching development teams.

●       Strong leadership qualities with a passion for technology innovation.

●       Have experience with Linux-based development environments, GitHub, and CI/CD

●       Atlassian stack (JIRA/Confluence)






 



Read more
Hyderabad
1 - 4 yrs
₹10L - ₹20L / yr
skill iconPython
MS-Excel
PowerBI
SQL
skill iconData Analytics

Company Profile :


Merilytics, an Accordion company, is a fast-growing analytics firm offering advanced and intelligent analytical solutions to clients globally. We combine domain expertise, advanced analytics, and technology to provide robust solutions for clients' business problems. You can find further details about the company at https://merilytics.com.


We partner with our clients in the Private Equity, CPG, Retail, Healthcare, Media & Entertainment, Technology, Logistics industries, etc. by providing analytical solutions to generate superior returns. We solve clients' business problems by analyzing large amounts of data to help guide their Operations, Marketing, Pricing, Customer Strategies, and much more.


Position :


- A Business Associate at Merilytics will be working on complex analytical projects and is the primary owner of the work streams involved.


- The Business Associates are expected to lead the team of Business Analysts to deliver robust analytical solutions consistently and mentor the Analysts for professional development.


Location : Hyderabad


Roles and Responsibilities :


The roles and responsibilities of a Business Associate will include the below:


- Proactively provide thought leadership to the team and have complete control on the delivery process of the project.


- Understand the client's point of view and translate it into sound judgment calls in ambiguous analytical situations.


- Highlight potential analytical issues upfront and resolve them independently.


- Synthesize the analysis and derive insights independently.


- Identify the crux of the client problem and leverage it to draw relevant actionable insights from the analysis/work.


- Ability to manage multiple Analysts and provide customized guidance for individual development.


- Resonate with our five core values - Client First, Excellence, Integrity, Respect and Teamwork.


Pre-requisites and skillsets required to apply for this role :


- An undergraduate degree (B.E./B.Tech.) from tier-1/tier-2 colleges is preferred.


- Should have 2-4 years of experience.


- Strong leadership & proactive communication to coordinate with the project team and other internal stakeholders.


- Ability to use business judgement and a structured approach towards solving complex problems.


- Experience in client-facing/professional services environment is a plus.


- Strong hands-on skills in analytics tools such as R, Python, SQL, and Excel are a plus.


Why Explore a Career at Merilytics :


- High growth environment: Semi-annual performance management and promotion cycles coupled with a strong meritocratic culture, enables fast track to leadership responsibility.


- Cross Domain Exposure: Interesting and challenging work streams across industries and domains that always keep you excited, motivated, and on your toes.


- Entrepreneurial Environment: Intellectual freedom to make decisions and own them. We expect you to spread your wings and assume larger responsibilities.


- Fun culture and peer group: Non-bureaucratic and fun working environment; Strong peer environment that will challenge you and accelerate your learning curve.


Other benefits for full time employees:


(i) Health and wellness programs that include employee health insurance covering immediate family members and parents, term life insurance for employees, free health camps for employees, discounted health services (including vision, dental) for employee and family members, free doctor's consultations, counselors, etc.


(ii) Corporate Meal card options for ease of use and tax benefits.


(iii) Work dinners, team lunches, company sponsored team outings and celebrations.


(iv) Reimbursement support for travel to the office, as and when promulgated by the Company.


(v) Cab reimbursement for women employees beyond a certain time of the day.


(vi) Robust leave policy to support work-life balance. Specially designed leave structure to support woman employees for maternity and related requests.


(vii) Reward and recognition platform to celebrate professional and personal milestones.


(viii) A positive & transparent work environment including various employee engagement and employee benefit initiatives to support personal and professional learning and development.

Read more
 is a software product company that provides


Agency job
via Dangi Digital Media LLP by jaibir dangi
Hyderabad
6 - 15 yrs
₹11L - ₹15L / yr
skill iconPython
Spark
SQL Azure
Apache Kafka
skill iconMongoDB
+4 more

- 5+ years of experience designing, developing, validating, and automating ETL processes.
- 3+ years of experience with traditional ETL tools such as Visual Studio, SQL Server Management Studio, SSIS, SSAS, and SSRS.
- 2+ years of experience with cloud technologies and platforms such as Kubernetes, Spark, Kafka, Azure Data Factory, Snowflake, MLflow, Databricks, Airflow, or similar.
- Must have experience with designing and implementing data access layers.
- Must be an expert with SQL/T-SQL and Python.
- Must have experience with Kafka.
- Define and implement data models with various database technologies like MongoDB, CosmosDB, Neo4j, MariaDB, and SQL Server.
- Ingest and publish data from sources and to destinations via an API.
- Exposure to ETL/ELT using Kafka or Azure Event Hubs with Spark or Databricks is a plus.
- Exposure to healthcare technologies and integrations for FHIR API, HL7, or other HIE protocols is a plus.
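
For illustration only: a minimal PySpark Structured Streaming sketch in the spirit of the Kafka/Spark experience listed above, reading JSON events from a Kafka topic and writing them to Parquet. The broker address, topic, schema, and paths are hypothetical placeholders; an Azure Event Hubs or Databricks setup would differ in configuration.

# Illustrative streaming ETL sketch; broker, topic, schema, and paths are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("kafka-etl-sketch").getOrCreate()

schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_time", TimestampType()),
    StructField("payload", StringType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")   # placeholder broker
    .option("subscribe", "clinical-events")              # placeholder topic
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "/data/curated/events")              # placeholder sink
    .option("checkpointLocation", "/data/checkpoints/events")
    .start()
)
query.awaitTermination()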


Skills Required :


Designing, Developing, ETL, Visual Studio, Python, Spark, Kubernetes, Kafka, Azure Data Factory, SQL Server, Airflow, Databricks, T-SQL, MongoDB, CosmosDB, Snowflake, SSIS, SSAS, SSRS, FHIR API, HL7, HIE Protocols

Read more
Brisa Technologies Private Limited
Hyderabad, Bengaluru (Bangalore)
3 - 9 yrs
₹5L - ₹15L / yr
skill iconPython
skill iconC
skill iconC++
Microcontrollers
Storage & Networking

Job Title: System Engineer



Responsibilities include, but are not limited to:

· Understand Mobile SoCs architecture across different Multimedia and Connectivity applications

· Evaluate memory/storage architecture on mobile platforms and develop architectures to improve it

· Innovate new solutions to complex multi-disciplinary problems by collaborating with other team members

· Identify new mobile workloads that will define memory/storage usage in future products

· Evaluate and present architecture trade-offs impacting memory/storage subsystem performance and power

Successful candidates for this position will have the following:

· Bachelors/Master’s degree in Electronics and/or Computer Engineering or Computer Science or related field

· 4-8 years of work experience in the Mobile ecosystem

· Hands-on experience with power and performance analysis

· Good understanding of mobile SoC systems and the Android system

· Good understanding of and experience with system benchmarking and multimedia applications

· Good understanding of the storage subsystem, with hands-on experience with host-side drivers

Preferred Skills:

· Strong Mobile Platform SW and HW knowledge

· Strong working knowledge of DRAM, NAND, and other memory/storage devices

· Strong understanding of architecture in multimedia and connectivity use cases

· Experience in C++, Python, or other system programming languages



Read more
Eitacies Inc

at Eitacies Inc

2 candid answers
Vijay Kumar
Posted by Vijay Kumar
Hyderabad
5 - 10 yrs
₹10L - ₹20L / yr
skill iconKubernetes
Infrastructure management
skill iconDocker
DevOps
CI/CD
+5 more

We are looking for an experienced Sr. DevOps Consultant Engineer to join our team. The ideal candidate should have 5+ years of experience.


We are retained by a promising startup located in Silicon Valley, backed by a Fortune 50 firm, with veterans from firms such as Zscaler, Salesforce, and Oracle. The founding team has been part of three unicorns and two successful IPOs and is well funded by Dell Technologies and Westwave Capital. The company is widely recognized as an industry innovator in the data privacy and security space and is being built by proven cybersecurity executives who have successfully built and scaled high-growth security companies and built privacy programs as executives.

 

Responsibilities:

  • Develop and maintain infrastructure as code using tools like Terraform, CloudFormation, and Ansible
  • Manage and maintain Kubernetes clusters on EKS and EC2 instances
  • Implement and maintain automated CI/CD pipelines for microservices
  • Optimize AWS costs by identifying cost-saving opportunities and implementing cost-effective solutions
  • Implement best security practices for microservices, including vulnerability assessments, SOC2 compliance, and network security
  • Monitor the performance and availability of our cloud infrastructure using observability tools such as Prometheus, Grafana, and Elasticsearch
  • Implement backup and disaster recovery solutions for our microservices and databases
  • Stay up to date with the latest AWS services and technologies and provide recommendations for improving our cloud infrastructure
  • Collaborate with cross-functional teams, including developers, and product managers, to ensure the smooth operation of our cloud infrastructure
  • Experience with large scale system design and scaling services is highly desirable

Requirements:

  • Bachelor's degree in Computer Science, Engineering, or a related field
  • At least 5 years of experience in AWS DevOps and infrastructure engineering
  • Expertise in Kubernetes management, Docker, EKS, EC2, queues, Python threads, Celery optimization (a minimal tuning sketch follows this list), load balancers, AWS cost optimization, Elasticsearch, container management, and observability best practices
  • Experience with SOC2 compliance and vulnerability assessment best practices for microservices
  • Familiarity with AWS services such as S3, RDS, Lambda, and CloudFront
  • Strong scripting skills in languages like Python, Bash, and Go
  • Excellent communication skills and the ability to work in a collaborative team environment
  • Experience with agile development methodologies and DevOps practices
  • AWS certification (e.g. AWS Certified DevOps Engineer, AWS Certified Solutions Architect) is a plus.
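
For illustration only: a minimal Celery configuration sketch related to the queue and worker tuning mentioned in the requirements above. The broker URL and task are placeholders, and the setting values shown are common starting points rather than recommendations for this particular system.

# Illustrative Celery tuning sketch; broker URL, task, and values are placeholders.
from celery import Celery

app = Celery("workers", broker="redis://localhost:6379/0")  # placeholder broker

app.conf.update(
    task_acks_late=True,             # re-deliver tasks if a worker dies mid-task
    worker_prefetch_multiplier=1,    # avoid one worker hoarding queued tasks
    task_time_limit=300,             # hard-kill runaway tasks after 5 minutes
    broker_transport_options={"visibility_timeout": 600},
)


@app.task
def resize_image(object_key):
    # Placeholder task body; real work would pull the object and process it.
    return f"processed {object_key}"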


Notice period : Can join within a month

Read more
Quadratic Insights
Praveen Kondaveeti
Posted by Praveen Kondaveeti
Hyderabad
7 - 10 yrs
₹15L - ₹24L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+6 more

About Quadratyx:

We are a global, product-centric insight & automation services company. We help the world’s organizations make better & faster decisions using the power of insight & intelligent automation. We build and operationalize their next-gen strategy through Big Data, Artificial Intelligence, Machine Learning, Unstructured Data Processing and Advanced Analytics. Quadratyx can boast more extensive experience in data sciences & analytics than most other companies in India.

We firmly believe in Excellence Everywhere.


Job Description

Purpose of the Job/ Role:

• As a Technical Lead, your work is a combination of hands-on contribution, customer engagement and technical team management. Overall, you’ll design, architect, deploy and maintain big data solutions.


Key Requisites:

• Expertise in Data structures and algorithms.

• Technical management across the full life cycle of big data (Hadoop) projects from requirement gathering and analysis to platform selection, design of the architecture and deployment.

• Scaling of cloud-based infrastructure.

• Collaborating with business consultants, data scientists, engineers and developers to develop data solutions.

• Experience leading and mentoring a team of data engineers.

• Hands-on experience in test-driven development (TDD).

• Expertise in NoSQL databases like MongoDB, Cassandra, etc. (MongoDB preferred) and strong knowledge of relational databases.

• Good knowledge of Kafka and Spark Streaming internal architecture.

• Good knowledge of any Application Servers.

• Extensive knowledge of big data platforms like Hadoop, Hortonworks, etc.

• Knowledge of data ingestion and integration on cloud services such as AWS, Google Cloud, Azure, etc.


Skills/ Competencies Required

Technical Skills

• Strong expertise (9 or more out of 10) in at least one modern programming language, like Python or Java.

• Clear end-to-end experience in designing, programming, and implementing large software systems.

• Passion and analytical abilities to solve complex problems.

Soft Skills

• Always speaking your mind freely.

• Communicating ideas clearly in speech and writing; integrity to never copy or plagiarize the intellectual property of others.

• Exercising discretion and independent judgment where needed in performing duties; not needing micro-management, maintaining high professional standards.


Academic Qualifications & Experience Required

Required Educational Qualification & Relevant Experience

• Bachelor’s or Master’s in Computer Science, Computer Engineering, or related discipline from a well-known institute.

• Minimum 7 - 10 years of work experience as a developer in an IT organization (preferably with an Analytics / Big Data / Data Science / AI background).

Read more
RandomTrees

at RandomTrees

1 recruiter
Amareswarreddt yaddula
Posted by Amareswarreddt yaddula
Hyderabad
5 - 16 yrs
₹1L - ₹30L / yr
ETL
Informatica
Data Warehouse (DWH)
skill iconAmazon Web Services (AWS)
SQL
+3 more

We are hiring an AWS Data Engineer expert to join our team


Job Title: AWS Data Engineer

Experience: 5 Yrs to 10Yrs

Location: Remote

Notice: Immediate or Max 20 Days

Role: Permanent Role


Skillset: AWS, ETL, SQL, Python, Pyspark, Postgres DB, Dremio.


Job Description:

Able to develop ETL jobs.

Able to help with data curation/cleanup, data transformation, and building ETL pipelines.

Strong Postgres DB experience; knowledge of Dremio as a data visualization/semantic layer between the DB and the application is a plus.

SQL, Python, and PySpark are a must (a minimal sketch follows below).

Good communication skills.
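
For illustration only: a minimal PySpark sketch in the spirit of the ETL work described above, cleaning a CSV extract and loading it into Postgres over JDBC. Paths, connection details, and table names are hypothetical placeholders, and the Postgres JDBC driver must be on the Spark classpath for this to run.

# Illustrative ETL-to-Postgres sketch; paths and connection details are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, trim

spark = SparkSession.builder.appName("etl-to-postgres-sketch").getOrCreate()

raw = spark.read.option("header", True).csv("s3://example-bucket/raw/customers.csv")

# Basic curation: de-duplicate, trim emails, drop rows without an email
cleaned = (
    raw.dropDuplicates(["customer_id"])
    .withColumn("email", trim(col("email")))
    .filter(col("email").isNotNull())
)

(
    cleaned.write.format("jdbc")
    .option("url", "jdbc:postgresql://db-host:5432/analytics")  # placeholder URL
    .option("dbtable", "public.customers_clean")                # placeholder table
    .option("user", "etl_user")                                 # placeholder credentials
    .option("driver", "org.postgresql.Driver")
    .mode("overwrite")
    .save()
)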





Read more
[x]cube LABS

at [x]cube LABS

2 candid answers
1 video
Krishna kandregula
Posted by Krishna kandregula
Hyderabad
2 - 6 yrs
₹8L - ₹20L / yr
ETL
Informatica
Data Warehouse (DWH)
PowerBI
DAX
+12 more
  • Creating and managing ETL/ELT pipelines based on requirements
  • Build PowerBI dashboards and manage datasets needed.
  • Work with stakeholders to identify data structures needed for the future and perform any transformations, including aggregations.
  • Build data cubes for real-time visualisation needs and CXO dashboards.


Required Tech Skills


  • Microsoft PowerBI & DAX
  • Python, Pandas, PyArrow, Jupyter Notebooks, Apache Spark (a minimal aggregation sketch follows this list)
  • Azure Synapse, Azure DataBricks, Azure HDInsight, Azure Data Factory
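
For illustration only: a minimal pandas/PyArrow sketch of the kind of aggregation referenced above, rolling raw rows up to a dashboard-friendly grain and writing Parquet that a Power BI dataset or Spark job could pick up. Column names and the output path are assumptions; a real pipeline would read from the actual source systems.

# Illustrative aggregation sketch; columns and output path are assumptions.
import pandas as pd

raw = pd.DataFrame(
    {
        "region": ["North", "North", "South", "South"],
        "order_date": pd.to_datetime(["2024-01-01", "2024-01-02", "2024-01-01", "2024-01-03"]),
        "amount": [120.0, 80.0, 200.0, 50.0],
    }
)

# Aggregate to the grain a dashboard would need (region x day)
daily = (
    raw.groupby(["region", pd.Grouper(key="order_date", freq="D")])["amount"]
    .sum()
    .reset_index()
)

# Requires the pyarrow package to be installed for the Parquet engine
daily.to_parquet("daily_sales.parquet", engine="pyarrow", index=False)
print(daily)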



Read more
Vume Interactive

at Vume Interactive

3 recruiters
Shweta Jaiswal
Posted by Shweta Jaiswal
Bengaluru (Bangalore), Hyderabad
5 - 7 yrs
₹3L - ₹20L / yr
skill iconDocker
skill iconKubernetes
DevOps
skill iconAmazon Web Services (AWS)
Windows Azure
+5 more

Key Responsibilities:

  • Work with the development team to plan, execute and monitor deployments
  • Capacity planning for product deployments
  • Adopt best practices for deployment and monitoring systems
  • Ensure the SLAs for performance, up time are met
  • Constantly monitor systems, suggest changes to improve performance and decrease costs.
  • Ensure the highest standards of security



Key Competencies (Functional):

 

  • Proficiency in coding in at least one scripting language - Bash, Python, etc.
  • Has personally managed a fleet of servers (> 15)
  • Understand different environments production, deployment and staging
  • Worked in micro service / Service oriented architecture systems
  • Has worked with automated deployment systems – Ansible / Chef / Puppet.
  • Can write MySQL queries
Read more
Monarch Tractors India
Hyderabad
3 - 14 yrs
Best in industry
Graphic Designing
Simulation
skill iconC++
OpenGL
OpenCL
+7 more

Designation: Graphics and Simulation Engineer

Experience: 3-15 Yrs

Position Type: Full Time

Position Location: Hyderabad

 

Description:

We are looking for engineers to work on applied research problems related to computer graphics in autonomous driving of electric tractors. The team works towards creating a universe of farm environments in which tractors can drive around for the purposes of simulation, synthetic data generation for deep learning training, simulation of edge cases, and modelling of physics.

 

Technical Skills:

● Background in OpenGL, OpenCL, graphics algorithms and optimization is necessary.

● Solid theoretical background in computational geometry and computer graphics is desired. Deep learning background is optional.

● Experience in two view and multi-view geometry.

● Necessary Skills: Python, C++, Boost, OpenGL, OpenCL, Unity3D/Unreal, WebGL, CUDA.

●  Academic experience for freshers in graphics is also preferred.

●  Experienced candidates in Computer Graphics with no prior Deep Learning experience willing to apply their knowledge to vision problems are also encouraged to apply.

● Software development experience on low-power embedded platforms is a plus.

 

 Responsibilities:

● Understanding of engineering principles and a clear understanding of data structures and algorithms.

● Ability to understand, optimize and debug imaging algorithms.

●  Ability to drive a project from conception to completion, from research papers to code, with a disciplined approach to software development on the Linux platform

● Demonstrate outstanding ability to perform innovative and significant research in the form of technical papers, thesis, or patents.

● Optimize runtime performance of designed models.

● Deploy models to production and monitor performance and debug inaccuracies and exceptions.

● Communicate and collaborate with team members in India and abroad for the fulfillment of your duties and organizational objectives.

● Thrive in a fast-paced environment and have the ability to own the project end to end with minimum hand holding

●  Learn & adapt new technologies & skillsets

● Work on projects independently with timely delivery & defect free approach.

● Thesis focusing on the above skill set may be given more preference.

 

Read more
Accolite Digital
Nitesh Parab
Posted by Nitesh Parab
Bengaluru (Bangalore), Hyderabad, Gurugram, Delhi, Noida, Ghaziabad, Faridabad
4 - 8 yrs
₹5L - ₹15L / yr
ETL
Informatica
Data Warehouse (DWH)
SSIS
SQL Server Integration Services (SSIS)
+10 more

Job Title: Data Engineer

Job Summary: As a Data Engineer, you will be responsible for designing, building, and maintaining the infrastructure and tools necessary for data collection, storage, processing, and analysis. You will work closely with data scientists and analysts to ensure that data is available, accessible, and in a format that can be easily consumed for business insights.

Responsibilities:

  • Design, build, and maintain data pipelines to collect, store, and process data from various sources.
  • Create and manage data warehousing and data lake solutions.
  • Develop and maintain data processing and data integration tools.
  • Collaborate with data scientists and analysts to design and implement data models and algorithms for data analysis.
  • Optimize and scale existing data infrastructure to ensure it meets the needs of the business.
  • Ensure data quality and integrity across all data sources.
  • Develop and implement best practices for data governance, security, and privacy.
  • Monitor data pipeline performance and errors, and troubleshoot issues as needed.
  • Stay up-to-date with emerging data technologies and best practices.

Requirements:

Bachelor's degree in Computer Science, Information Systems, or a related field.

Experience with ETL tools like Matillion, SSIS, and Informatica

Experience with SQL and relational databases such as SQL server, MySQL, PostgreSQL, or Oracle.

Experience in writing complex SQL queries

Strong programming skills in languages such as Python, Java, or Scala.

Experience with data modeling, data warehousing, and data integration.

Strong problem-solving skills and ability to work independently.

Excellent communication and collaboration skills.

Familiarity with big data technologies such as Hadoop, Spark, or Kafka.

Familiarity with data warehouse/Data lake technologies like Snowflake or Databricks

Familiarity with cloud computing platforms such as AWS, Azure, or GCP.

Familiarity with Reporting tools

Teamwork/ growth contribution

  • Helping the team take interviews and identify the right candidates
  • Adhering to timelines
  • On-time status communication and upfront communication of any risks
  • Teach, train, and share knowledge with peers
  • Good communication skills
  • Proven ability to take initiative and be innovative
  • Analytical mind with a problem-solving aptitude

Good to have :

Master's degree in Computer Science, Information Systems, or a related field.

Experience with NoSQL databases such as MongoDB or Cassandra.

Familiarity with data visualization and business intelligence tools such as Tableau or Power BI.

Knowledge of machine learning and statistical modeling techniques.

If you are passionate about data and want to work with a dynamic team of data scientists and analysts, we encourage you to apply for this position.

Read more
Synechron

at Synechron

3 recruiters
Ranjini N
Posted by Ranjini N
Bengaluru (Bangalore), Hyderabad
6 - 10 yrs
₹2L - ₹15L / yr
ETL
Informatica
Data Warehouse (DWH)
skill iconPython
Shell Scripting
+2 more

Position: ETL Developer

Location: Mumbai

Exp.Level: 4+ Yrs

Required Skills:

* Strong scripting knowledge such as: Python and Shell

* Strong relational database skills especially with DB2/Sybase

* Create high quality and optimized stored procedures and queries

* Strong with scripting language such as Python and Unix / K-Shell

* Strong knowledge base of relational database performance and tuning such as: proper use of indices, database statistics/reorgs, de-normalization concepts.

* Familiar with lifecycle of a trade and flows of data in an investment banking operation is a plus.

* Experienced in Agile development process

* Java Knowledge is a big plus but not essential

* Experience in delivery of metrics / reporting in an enterprise environment (e.g. demonstrated experience in BI tools such as Business Objects, Tableau, report design & delivery) is a plus

* Experience on ETL processes and tools such as Informatica is a plus. Real time message processing experience is a big plus.

* Good team player; Integrity & ownership

Read more
Cambridge Technology

at Cambridge Technology

2 recruiters
Muthyala Shirish Kumar
Posted by Muthyala Shirish Kumar
Hyderabad
2 - 15 yrs
₹10L - ₹40L / yr
skill iconData Science
skill iconMachine Learning (ML)
Natural Language Processing (NLP)
Computer Vision
recommendation algorithm
+7 more

From building entire infrastructures or platforms to solving complex IT challenges, Cambridge Technology helps businesses accelerate their digital transformation and become AI-first businesses. With over 20 years of expertise as a technology services company, we enable our customers to stay ahead of the curve by helping them figure out the perfect approach, solutions, and ecosystem for their business. Our experts help customers leverage the right AI, big data, cloud solutions, and intelligent platforms that will help them become and stay relevant in a rapidly changing world.


No Of Positions: 1


Skills required: 

  • The ideal candidate will have a bachelor’s degree in data science, statistics, or a related discipline with 4-6 years of experience, or a master’s degree with 4-6 years of experience. A strong candidate will also possess many of the following characteristics:
  • Strong problem-solving skills with an emphasis on achieving proof-of-concept
  • Knowledge of statistical techniques and concepts (regression, statistical tests, etc.)
  • Knowledge of machine learning and deep learning fundamentals
  • Experience with Python implementations to build ML and deep learning algorithms (e.g., pandas, NumPy, scikit-learn, statsmodels, Keras, PyTorch, etc.); a minimal scikit-learn sketch follows this list
  • Experience writing and debugging code in an IDE
  • Experience using managed web services (e.g., AWS, GCP, etc.)
  • Strong analytical and communication skills
  • Curiosity, flexibility, creativity, and a strong tolerance for ambiguity
  • Ability to learn new tools from documentation and internet resources.
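
For illustration only: a minimal scikit-learn workflow (synthetic data, train/test split, logistic regression, accuracy) of the kind the tool list above refers to. The dataset shape and model choice are arbitrary assumptions.

# Illustrative scikit-learn sketch; data and model choice are arbitrary.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic dataset standing in for real client data
X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

print("accuracy:", accuracy_score(y_test, model.predict(X_test)))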

Roles and responsibilities :

  • You will work on a small, core team alongside other engineers and business leaders throughout Cambridge with the following responsibilities:
  • Collaborate with client-facing teams to design and build operational AI solutions for client engagements.
  • Identify relevant data sources for data wrangling and EDA
  • Identify model architectures to use for client business needs.
  • Build full-stack data science solutions up to MVP that can be deployed into existing client business processes or scaled up based on clear documentation.
  • Present findings to teammates and key stakeholders in a clear and repeatable manner.

Experience :

2 - 14 Yrs

Read more
Monarch Tractors India
Hyderabad
2 - 8 yrs
Best in industry
skill iconMachine Learning (ML)
skill iconData Science
Algorithms
skill iconPython
skill iconC++
+10 more

Designation: Perception Engineer (3D) 

Experience: 0 years to 8 years 

Position Type: Full Time 

Position Location: Hyderabad 

Compensation: As Per Industry standards 

 

About Monarch: 

At Monarch, we’re leading the digital transformation of farming. Monarch Tractor augments both muscle and mind with fully loaded hardware, software, and service machinery that will spur future generations of farming technologies. 

With our farmer-first mentality, we are building a smart tractor that will enhance (not replace) the existing farm ecosystem, alleviate labor availability and cost issues, and provide an avenue for competitive organic and beyond farming by providing mechanical solutions to replace harmful chemical solutions. Despite all the cutting-edge technology we will incorporate, our tractor will still plow, till, and haul better than any other tractor in its class. We have all the necessary ingredients to develop, build and scale the Monarch Tractor and digitally transform farming around the world. 

 

Description: 

We are looking for engineers to work on applied research problems related to perception in autonomous driving of electric tractors. The team works on classical and deep learning-based techniques for computer vision. Several problems like SfM, SLAM, 3D image processing, multiple-view geometry, etc. are being solved for deployment on resource-constrained hardware. 

 

Technical Skills: 

  • Background in Linear Algebra, Probability and Statistics, graphical algorithms and optimization problems is necessary. 
  • Solid theoretical background in 3D computer vision, computational geometry, SLAM and robot perception is desired. Deep learning background is optional. 
  • Knowledge of some numerical algorithms or libraries among: Bayesian filters, SLAM, Eigen, Boost, g2o, PCL, Open3D, ICP (a minimal ICP registration sketch follows this list). 
  • Experience in two view and multi-view geometry. 
  • Necessary Skills: Python, C++, Boost, Computer Vision, Robotics, OpenCV. 
  • Academic experience for freshers in Vision for Robotics is preferred.  
  • Experienced candidates in Robotics with no prior Deep Learning experience willing to apply their knowledge to vision problems are also encouraged to apply. 
  • Software development experience on low-power embedded platforms is a plus. 
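
For illustration only: a minimal Open3D point-to-point ICP registration sketch related to the libraries listed above, run on two synthetic point clouds standing in for real sensor data. The correspondence threshold and initial transform are arbitrary assumptions.

# Illustrative ICP sketch; the synthetic clouds and threshold are arbitrary.
import numpy as np
import open3d as o3d

# Build a source cloud and a slightly translated copy as the target
points = np.random.rand(500, 3)
source = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(points))
target = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(points + [0.05, 0.0, 0.0]))

# Point-to-point ICP: (source, target, max correspondence distance, initial transform, estimator)
result = o3d.pipelines.registration.registration_icp(
    source,
    target,
    0.1,
    np.eye(4),
    o3d.pipelines.registration.TransformationEstimationPointToPoint(),
)

print("fitness:", result.fitness)
print("estimated transform:\n", result.transformation)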

 

Responsibilities: 

  • Understanding engineering principles and a clear understanding of data structures and algorithms. 
  • Ability to understand, optimize and debug imaging algorithms. 
  • Ability to drive a project from conception to completion, research papers to code with disciplined approach to software development on Linux platform. 
  • Demonstrate outstanding ability to perform innovative and significant research in the form of technical papers, thesis, or patents. 
  • Optimize runtime performance of designed models. 
  • Deploy models to production and monitor performance and debug inaccuracies and exceptions. 
  • Communicate and collaborate with team members in India and abroad for the fulfillment of your duties and organizational objectives. 
  • Thrive in a fast-paced environment and can own the project end to end with minimum hand holding. 
  • Learn & adapt to new technologies & skillsets. 
  • Work on projects independently with timely delivery & defect free approach. 
  • Thesis focusing on the above skill set may be given more preference. 

 

What you will get: 

At Monarch Tractor, you’ll play a key role on a capable, dedicated, high-performing team of rock stars. Our compensation package includes a competitive salary and excellent health benefits commensurate with the role you’ll play in our success.  

 

Read more
Cambridge Technology

at Cambridge Technology

2 recruiters
Muthyala Shirish Kumar
Posted by Muthyala Shirish Kumar
Hyderabad
11 - 20 yrs
₹25L - ₹40L / yr
Engineering Management
skill iconJava
skill iconNodeJS (Node.js)
skill iconPython
skill iconAndroid Development
+1 more

We are looking for a Technical Program Manager with at least 12 years of experience managing the planning, execution, and delivery of complex technical projects or programs. You will ensure that technical projects are completed within agreed-upon timelines, budgets, and quality standards, and shall play a critical role in ensuring the successful delivery of complex technical projects and programs.


No of Positions 1

 

Skills Required

  • 12+ years experience with development background and excellent communication skills
  • Should have strong project management and people skills along with technology skills (primarily web based application development, but can be any)
  • Program Planning: Define the scope and objectives of the program and develop a detailed project plan, including schedules, budgets, and resource requirements.
  • Risk Management: Manage program risks, develops mitigation plans, and communicates risk status to stakeholders.
  • Stakeholder Management: Work closely with stakeholders to ensure that program objectives are aligned with business goals and objectives.
  • Communication: Communicate program status, risks, and issues to stakeholders and senior management.
  • Project Management: Manage the day-to-day activities of the project team, ensuring that tasks are completed on time and within budget.
  • Quality Assurance: Ensure that the program meets the required quality standards.
  • Resource Management: Manage program resources, including staffing, budget, and equipment.
  • Change Management: Manage changes to the program scope, schedule, and budget and ensure that stakeholders are informed and aligned with the changes.
  • Technical Expertise: You will have the strong technical expertise to understand the technical aspects of the program and provide guidance to the project team.
  • Continuous Improvement: Continuously monitor and evaluate program performance and identify opportunities for improvement.
  • Collaboration: Collaborate effectively with cross-functional teams, including software engineers, product managers, and other stakeholders. You must be able to build and maintain strong relationships with team members and stakeholders to ensure the project's success.

Roles & Responsibilities:

  • Supervise, plan, and manage the stages of product development, approving the product at each stage.
  • For maximum efficiency, create detailed project plans at each stage of product development.
  • Maintain a steady flow of ideas and solutions during product development initiatives in order to introduce innovation and improve operational efficiency.
  • Keep track of key metrics and create reports to communicate development progress to senior management, product managers, and cross-functional stakeholders.
  • Work with a variety of teams, such as software architects, software engineers, system engineers, developers, and product teams.
  • Determine cross-team dependencies and include them in the program planning process.
  • Determine how to solve technical problems by diagnosing them and suggesting potential solutions.
  • Ensure that product development and delivery can be completed within the product budget and timeframe.
  • Oversee the product deployment process and, if necessary, assist with the integration process.
  • Make necessary changes to product development processes based on performance metrics.
  • Stay current on the latest developments in our product category and industry.
  • Ensure complete compliance with industry standards by documenting all processes and adjusting them as needed.
  • Manage project escalations and assist in the formation of project teams as needed.
  • Maintain and implement, within the organization, the project plans and technical programs necessary to assist product management teams.

Experience: 12+Years


Location: Hyderabad

Read more
Xemplar


Agency job
via Fruges IT services by Nishanthi Y
Hyderabad
10 - 15 yrs
₹10L - ₹15L / yr
skill iconGo Programming (Golang)
skill iconRuby on Rails (ROR)
skill iconRuby
skill iconPython
skill iconJava
+7 more

Job Responsibilities

✓ Perform the role of Technical Lead on assigned projects to drive solution design (especially backend) and API services development.

✓ Be the thought leader and champion for above mentioned technologies.

✓ Drive technical analysis for new projects including planning and driving proof-of-concepts, if needed.

✓ Drive tasks related to backend development by providing architectural and technical leadership to mid-tier and database developers.

✓ Conduct peer reviews, as the lead, of code merged into Git to confirm that developed code meets acceptable standards and guidelines.

✓ Work closely with the rest of the leads, mid-tier development, front-end developers, database developers, etc. to ensure end-to-end integrity of the solution being developed.

✓ Work closely with the rest of the tech leads and senior engineering leadership team to ensure reuse where applicable to increase productivity and throughput.

✓ Conduct technical interview to staff open positions in the backend team.

✓ Delegate work and assignments to team members

✓ Collaborate with their team to identify and fix technical problems

✓ Analyze users' needs and then finding applications to serve them

✓ Drive assigned tasks related to SOC 2 certification and ensure compliance with defined controls for areas under the lead’s purview.

✓ Guiding their team through technical issues and challenges

✓ Prepare Technical design documents which would help the team to understand the technical flow

✓ Actively participate in customer calls, especially discussions related to technical/architectural topics, and provide inputs.

 

Required Experience: 

✓ Backend Lead with around 14 years of experience

✓ Serverless computing architecture

✓ NodeJS, MySQL, Jenkins, Python, GitLab Technologies

✓ Good knowledge of AWS Cloud

✓ Full cycle AWS implementation experience

✓ Project experience in development and maintenance support for AWS web services and cloud-based implementations

✓ Experience leading teams of up to 10 + professionals

Ability to manage multiple tasks and projects in a fast-moving environment 

 

Educational Qualifications: 

Engineering graduate or B. Tech/MCA with relevant major subjects like Computer Science

Read more
Monarch Tractors India
Venkat Ramthirdh
Posted by Venkat Ramthirdh
Hyderabad
8 - 12 yrs
Best in industry
cypress
Selenium
Appium
Test Automation (QA)
Software Testing (QA)
+10 more

Job Description:  

Responsibilities: 

  • Define standards and quality metrics  
  • Putting testing strategies and practices in place 
  • Leading the team of test engineers, training them on functional and non-functional needs 
  • Reporting status, defining risks, contingencies, plans, and escalations 
  • Ensure that several testing and validation processes are improved continuously. 
  • Ensure that quality improvement tools, such as code coverage and memory leak detection, are part of the development cycle. 

Requirements and skills:  

  • Experience with Linux environments  
  • Experience with Cypress
  • Strong programming skills in Python 
  • Experience in using Selenium Webdriver for test automation 
  • Able to follow the agile process for the whole life cycle of testing. 
  • Expert-level knowledge of Jira bug tracking and planning tool 

 


Read more
Monarch Tractors India
Venkat Ramthirdh
Posted by Venkat Ramthirdh
Hyderabad
0 - 1 yrs
Best in industry
skill iconPython
Data Structures
Algorithms
Linux/Unix

Job Description:  

Responsibilities: 

  • Completing all tasks set by the supervisor and assisting wherever possible.  
  • Observing existing strategies and techniques for coding, debugging, and testing, and adapting to the same 
  • Ability to maintain composure under pressure 
  • Ability to work in a team. 
  • Good observation skills and a willingness to learn. 

Skills: 

  • Proficiency in data structures and algorithms 
  • Good problem solving and analytical thinking skills 
  • Knowledge of Linux systems 
  • Python coding knowledge 
  • Knowledge of object-oriented programming 
  • Good verbal and written communication skills.  

 

Requisition Raised by:  

Engineering Director 

 

Read more
Monarch Tractors India
Hyderabad
3 - 8 yrs
Best in industry
skill iconPython
Red Hat Linux
Linux/Unix
ROS

About Monarch: 

At Monarch, we’re leading the digital transformation of farming. Monarch Tractor augments both muscle and mind with fully loaded hardware, software, and service machinery that will spur future generations of farming technologies. 

With our farmer-first mentality, we are building a smart tractor that will enhance (not replace) the existing farm ecosystem, alleviate labor availability, and cost issues, and provide an avenue for competitive organic and beyond farming by providing mechanical solutions to replace harmful chemical solutions. Despite all the cutting-edge technology we will incorporate, our tractor will still plow, till, and haul better than any other tractor in its class. We have all the necessary ingredients to develop, build and scale the Monarch Tractor and digitally transform farming around the world. 

What you will do: 

1. Design, implement and deliver custom solutions using the existing robotics framework. 

2. Debug issues, do root-cause analysis and apply fixes. 

3. Design and implement tools to facilitate application development and testing. 

4. Participate in architectural improvements. 

5. Work with team members in deployment and field testing. 

Qualifications: 

1. Bachelor’s degree / Masters in Engineering (ECE or CSE preferred) 

2. Work experience of 3+ years in software programming. 

3. Proficiency in Python programming for Linux based systems. 

4. Full understanding of software engineering. 

 

5. Basic Knowledge of Robot Operating System (ROS) is a plus. 

6. Good understanding of the algorithms and control loops. 

7. Working knowledge of Git: creating, merging branches, cherry-picking commits, examining the diff between two hashes. Advanced Git usage is a plus.  

8. Knowledge of video streaming from edge devices is a plus. 

9. Thrive in a fast-paced environment and have the ability to own the project’s tasks end-to-end with minimum hand-holding  

10. Learn and adapt new technologies & skills. Work on projects independently with timely delivery & defect free approach. 

What you will get: 

At Monarch Tractor, you’ll play a key role on a capable, dedicated, high-performing team of rock stars. Our compensation package includes a competitive salary, excellent health, dental and vision benefits, and company equity commensurate with the role you’ll play in our success.  

 

Read more
Hyderabad
5 - 7 yrs
Best in industry
RTMP
MySQL
Web Realtime Communication (WebRTC)
skill iconPython
Linux/Unix
+1 more

At Monarch, we’re leading the digital transformation of farming. Monarch Tractor augments both muscle and mind with fully loaded hardware, software, and service machinery that will spur future generations of farming technologies.


With our farmer-first mentality, we are building a smart tractor that will enhance (not replace) the existing farm ecosystem, alleviate labor availability, and cost issues, and provide an avenue for competitive organic and beyond farming by providing mechanical solutions to replace harmful chemical solutions. Despite all the cutting-edge technology we will incorporate, our tractor will still plow, till, and haul better than any other tractor in its class. We have all the necessary ingredients to develop, build and scale the Monarch Tractor and digitally transform farming around the world. For more details visit

www.monarchtractor.com

 


 

Requirements and skills:

· 3+ Years of software development experience

· Strong in data structures and algorithms

· Solid understanding of Linux development environment and systems

· Expert-level knowledge of Python along with some application framework

· Prior experience with WebRTC and video streaming protocols like RTMP, RTP and payloads

· Prior experience with H.264, H.265, VP8, VP9, and AV1 encoders and decoders

· Prior experience with GStreamer pipelines (a minimal pipeline sketch follows this list)

· Knowledge of C++ and proxy servers such as NGINX is an added advantage

· Proficient in writing unit test cases using the Pytest framework.

· Expert-level knowledge of SQL databases like MySQL

· Understanding of microservices architecture

· Knowledge of AWS cloud services like EC2, S3, Lambda etc.
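
For illustration only: a minimal sketch of building and running a GStreamer pipeline from Python (PyGObject), encoding a test video source to H.264 and writing an MP4 file. It is not a description of this product's actual pipelines; element choices and the output file name are assumptions.

# Illustrative GStreamer sketch; requires GStreamer and the PyGObject bindings.
import gi

gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# parse_launch keeps the sketch short; a production pipeline would usually be
# built element by element with a bus watch for errors and EOS handling.
pipeline = Gst.parse_launch(
    "videotestsrc num-buffers=300 ! videoconvert ! x264enc ! h264parse ! "
    "mp4mux ! filesink location=test.mp4"
)

pipeline.set_state(Gst.State.PLAYING)

# Block until end-of-stream or error, then shut the pipeline down cleanly
bus = pipeline.get_bus()
bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE, Gst.MessageType.EOS | Gst.MessageType.ERROR)
pipeline.set_state(Gst.State.NULL)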

Read more
Bullhorn Consultants

at Bullhorn Consultants

4 recruiters
vidya venugopal
Posted by vidya venugopal
Pune, Bengaluru (Bangalore), Hyderabad, Mumbai, Chennai, Delhi
6 - 11 yrs
₹1L - ₹30L / yr
skill iconPython
skill iconAmazon Web Services (AWS)

Job Responsibilities:


Support, maintain, and enhance existing and new product functionality for trading software in a real-time, multi-threaded, multi-tier server architecture environment, and create high- and low-level designs for concurrent, high-throughput, low-latency software architecture.


  • Provide software development plans that meet future needs of clients and markets
  • Evolve the new software platform and architecture by introducing new components and integrating them with existing ones
  • Perform memory, CPU, and resource management
  • Analyze stack traces, memory profiles and production incident reports from traders and support teams
  • Propose fixes and enhancements to existing trading systems
  • Adhere to release and sprint planning with the Quality Assurance Group and Project Management
  • Work on a team building new solutions based on requirements and features
  • Attend and participate in daily scrum meetings


Required Skills:

  •  JavaScript and Python
  • Multi-threaded browser and server applications (a minimal Python threading sketch follows this list)
  • Amazon Web Services (AWS)
  • REST
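
For illustration only: a minimal multi-threaded Python sketch in the spirit of the skills above, fetching several REST endpoints concurrently with a thread pool. The URLs are placeholders, not real services.

# Illustrative threading sketch; the endpoints are placeholders.
from concurrent.futures import ThreadPoolExecutor, as_completed
from urllib.request import urlopen

URLS = [
    "https://example.com/api/quotes",      # placeholder endpoints
    "https://example.com/api/orders",
    "https://example.com/api/positions",
]


def fetch(url):
    # Return the URL and the number of bytes read; real code would parse JSON
    with urlopen(url, timeout=5) as resp:
        return url, len(resp.read())


with ThreadPoolExecutor(max_workers=4) as pool:
    futures = {pool.submit(fetch, url): url for url in URLS}
    for future in as_completed(futures):
        try:
            url, size = future.result()
            print(f"{url}: {size} bytes")
        except Exception as exc:
            print(f"{futures[future]} failed: {exc}")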


Read more