o Strong Python development skills, with 7+ years of experience with SQL.
o A bachelor's or master's degree in Computer Science or a related area
o 8+ years of experience in data integration and pipeline development
o Experience implementing Databricks Delta Lake and data lakes
o Expertise in designing and implementing data pipelines using modern data engineering approaches and tools: SQL, Python, Delta Lake, Databricks, Snowflake, Spark (a short pipeline sketch follows this list)
o Experience working with multiple file formats (Parquet, Avro, Delta Lake) and APIs
o Experience with AWS Cloud for data integration with S3.
o Hands-on development experience with Python and/or Scala.
o Experience with SQL and NoSQL databases.
o Experience using data modeling techniques and tools (focused on dimensional design)
o Experience with microservice architecture using Docker and Kubernetes
o Experience working with one or more public cloud providers, e.g. AWS, Azure, or GCP
o Experience in effectively presenting and summarizing complex data to diverse audiences through visualizations and other means
o Excellent verbal and written communication skills and strong leadership capabilities
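A minimal sketch of the kind of pipeline this role covers, assuming a PySpark environment with the Delta Lake package available; the bucket paths and column names are hypothetical.

from pyspark.sql import SparkSession, functions as F

# Assumes Spark is launched with Delta Lake configured (e.g. on Databricks
# or via --packages io.delta:delta-spark).
spark = SparkSession.builder.appName("orders-pipeline").getOrCreate()

# Hypothetical input: raw Parquet files landed in S3.
raw = spark.read.parquet("s3://example-bucket/raw/orders/")

# Simple dimensional-style transform: deduplicate and derive a date key.
cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date_key", F.date_format("order_ts", "yyyyMMdd"))
)

# Write to a Delta Lake table for downstream analytics.
cleaned.write.format("delta").mode("overwrite").save("s3://example-bucket/delta/orders/")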
Skills:
Python

Job Title : Python Developer – API Integration & AWS Deployment
Experience : 5+ Years
Location : Bangalore
Work Mode : Onsite
Job Overview :
We are seeking an experienced Python Developer with strong expertise in API development and AWS cloud deployment.
The ideal candidate will be responsible for building scalable RESTful APIs, automating power system simulations using PSS®E (psspy), and deploying automation workflows securely and efficiently on AWS.
Mandatory Skills : Python, FastAPI/Flask, PSS®E (psspy), RESTful API Development, AWS (EC2, Lambda, S3, EFS, API Gateway), AWS IAM, CloudWatch.
Key Responsibilities :
Python Development & API Integration :
- Design, build, and maintain RESTful APIs using FastAPI or Flask to interface with PSS®E.
- Automate simulations and workflows using the PSS®E Python API (psspy).
- Implement robust bulk case processing, result extraction, and automated reporting systems.
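A rough sketch of the API-integration pattern described above, assuming FastAPI; the run_power_flow helper is a hypothetical stand-in for the actual psspy calls (case loading, solution, result extraction), which depend on the local PSS®E installation.

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="PSSE Automation API")

class SimulationRequest(BaseModel):
    case_path: str        # path to a saved case file
    solver: str = "fnsl"  # hypothetical solver selector

def run_power_flow(case_path: str, solver: str) -> dict:
    """Hypothetical wrapper around psspy; a real deployment would initialize
    PSS(R)E, load the case, solve, and extract results here."""
    # import psspy  # available only where PSS(R)E is installed
    raise NotImplementedError("wire up psspy calls here")

@app.post("/simulations")
def create_simulation(req: SimulationRequest):
    try:
        results = run_power_flow(req.case_path, req.solver)
    except NotImplementedError as exc:
        raise HTTPException(status_code=501, detail=str(exc))
    return {"status": "completed", "results": results}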
AWS Cloud Deployment :
- Deploy APIs and automation pipelines using AWS services such as EC2, Lambda, S3, EFS, and API Gateway.
- Apply cloud-native best practices to ensure reliability, scalability, and cost efficiency.
- Manage secure access control using AWS IAM, API keys, and implement monitoring using CloudWatch.
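On the deployment side, a small boto3 sketch of the S3 upload plus CloudWatch monitoring pattern mentioned above; the bucket, key, and metric namespace names are placeholders.

import boto3

s3 = boto3.client("s3")
cloudwatch = boto3.client("cloudwatch")

def publish_results(local_path: str, bucket: str, key: str) -> None:
    # Upload the generated report to S3.
    s3.upload_file(local_path, bucket, key)
    # Emit a custom metric so CloudWatch alarms and dashboards can track runs.
    cloudwatch.put_metric_data(
        Namespace="PsseAutomation",  # hypothetical namespace
        MetricData=[{"MetricName": "SimulationCompleted", "Value": 1, "Unit": "Count"}],
    )

# Example usage (placeholder names):
# publish_results("report.csv", "example-bucket", "reports/report.csv")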
Required Skills :
- 5+ years of professional experience in Python development.
- Hands-on experience with RESTful API development (FastAPI/Flask).
- Solid experience working with PSS®E and its psspy Python API.
- Strong understanding of AWS services, deployment, and best practices.
- Proficiency in automation, scripting, and report generation.
- Knowledge of cloud security and monitoring tools like IAM and CloudWatch.
Good to Have :
- Experience in power system simulation and electrical engineering concepts.
- Familiarity with CI/CD tools for AWS deployments.
Job Title: Backend Engineer – Python / Golang / Rust
Location: Bangalore, India
Experience Required: Minimum 2–3 years
About the Role
We are looking for a passionate Backend Engineer to join our growing engineering team. The ideal candidate should have hands-on experience in building enterprise-grade, scalable backend systems using microservices architecture. You will work closely with product, frontend, and DevOps teams to design, develop, and optimize robust backend solutions that can handle high traffic and ensure system reliability.
Key Responsibilities
• Design, develop, and maintain scalable backend services and APIs.
• Architect and implement microservices-based systems ensuring modularity and resilience.
• Optimize application performance, database queries, and service scalability.
• Collaborate with frontend engineers, product managers, and DevOps teams for seamless delivery.
• Implement security best practices and ensure data protection compliance.
• Write and maintain unit tests, integration tests, and documentation.
• Participate in code reviews, technical discussions, and architecture design sessions.
• Monitor, debug, and improve system performance in production environments.
Required Skills & Experience
• Programming Expertise:
• Advanced proficiency in Python (Django, FastAPI, or Flask), OR
• Strong experience in Golang or Rust for backend development.
• Microservices Architecture: Hands-on experience in designing and maintaining distributed systems.
• Database Management: Expertise in PostgreSQL, MySQL, MongoDB, including schema design and optimization.
• API Development: Strong experience in RESTful APIs and GraphQL.
• Cloud Platforms: Proficiency with AWS, GCP, or Azure for deployment and scaling.
• Containerization & Orchestration: Solid knowledge of Docker and Kubernetes.
• Messaging & Caching: Experience with Redis, RabbitMQ, Kafka, and caching strategies (Redis, Memcached).
• Version Control: Strong Git workflows and collaboration in team environments.
• Familiarity with CI/CD pipelines, DevOps practices, and cloud-native deployments.
• Proven experience working on production-grade, high-traffic applications.
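As a minimal sketch of one pattern this list describes (a REST endpoint with cache-aside caching in Redis), assuming FastAPI and a local Redis instance; the endpoint path, key format, and stub database call are illustrative.

import json

import redis
from fastapi import FastAPI

app = FastAPI()
cache = redis.Redis(host="localhost", port=6379, decode_responses=True)

def load_product_from_db(product_id: int) -> dict:
    # Placeholder for a real PostgreSQL/MySQL query.
    return {"id": product_id, "name": f"product-{product_id}"}

@app.get("/products/{product_id}")
def get_product(product_id: int):
    key = f"product:{product_id}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)  # cache hit
    product = load_product_from_db(product_id)
    cache.setex(key, 60, json.dumps(product))  # cache for 60 seconds
    return product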
Preferred Qualifications
• Understanding of software architecture patterns (event-driven, CQRS, hexagonal, etc.).
• Experience with Agile/Scrum methodologies.
• Contributions to open-source projects or strong personal backend projects.
• Experience with observability tools (Prometheus, Grafana, ELK, Jaeger).
Why Join Us?
• Work on cutting-edge backend systems that power enterprise-grade applications.
• Opportunity to learn and grow with a fast-paced engineering team.
• Exposure to cloud-native, microservices-based architectures.
• Collaborative culture that values innovation, ownership, and technical excellence.
Job Details:
As a Software Engineer, you will challenge the idea of “impossible,” producing results that are elegant, simple, and don’t require a team of experts to decode. You are driven by innovation, fresh ideas, and new ways to produce high-quality solutions.
Job Description:
Position Summary:
We are looking for a Cloud Developer responsible for the development and maintenance of cloud applications deployed in an AWS environment. Your primary focus will be the development of such applications and their integration with other services. A commitment to open-mindedness, problem solving, continuous learning, and creating quality products is essential.
Responsibilities:
- Ensure the performance, quality, and responsiveness of services
- Collaborate with a team to define, design, and ship new features
- Apply innovative thinking to find solutions to needs
- Identify and correct bottlenecks and fix bugs
- Help maintain code quality, automation, and documentation
- Use Agile Scrum Methodology for software development
- Develop unit tests for all new code
- Provide code reviews for all new code and participate in reviews of others' code
- Diagnose and resolve complex application issues
- Interact with personnel at all levels across different teams
- Design and build services on top of AWS
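Touching on the unit-testing responsibility above, a small pytest-style sketch; the apply_discount helper is a made-up example function, not part of an actual codebase.

import pytest

def apply_discount(price: float, percent: float) -> float:
    """Example function under test (hypothetical)."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

def test_apply_discount_happy_path():
    assert apply_discount(200.0, 25) == 150.0

def test_apply_discount_rejects_invalid_percent():
    with pytest.raises(ValueError):
        apply_discount(100.0, 150)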
Skills:
- Strong knowledge of Python
- Strong knowledge of Web Services (Rest or SOAP APIs)
- Strong knowledge of React JS or other JavaScript frameworks
- Solid understanding of object-oriented programming
- Knowledge of Java and Spring Boot is good to have
- Knowledge of AWS is good to have
- Knowledge of TypeScript is good to have
- Knowledge of Linux is good to have
- Knowledge of HTML and CSS is good to have
- Knowledge of AWS CloudFormation is good to have
- Knowledge of Elasticsearch is good to have
- Familiarity with continuous integration
- Any recognized Java, AWS, Linux, or Python certifications will be an added value
- Minimum 2 years of work experience in relevant technologies
- Excellent interpersonal and written communication skills
- Solid understanding of Data structures and Algorithms.
- Exceptional coding skills in an Object-Oriented programming language (Golang/Python)
- Must have a basic understanding of AWS (EC2, Lambda, Boto, CI/CD), Celery, RabbitMQ, and similar task-queue management tools/libraries (a minimal Celery sketch follows this list).
- Experience with web technologies: Python, Linux, Apache, Solr, Memcached, Redis, gRPC
- Experience with high-performance services handling millions of daily requests is a plus
- Strong understanding of Python and Django.
- Good knowledge of various Python Libraries, APIs, and tool kits.
- Basic understanding of front-end technologies, such as JavaScript, HTML5, and CSS3.
- Proficient understanding of code versioning tools such as Git.
- Understanding of the threading limitations of Python, and multi-process architecture
- Understanding of databases and MySQL
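A minimal Celery sketch along the lines of the task-queue requirement above, assuming a local RabbitMQ broker; the task name and broker URL are placeholders.

from celery import Celery

# Assumes RabbitMQ is reachable at the default local AMQP URL.
app = Celery("tasks", broker="amqp://guest:guest@localhost//")

@app.task
def generate_report(report_id: int) -> str:
    # Placeholder for long-running work offloaded from the web process.
    return f"report-{report_id}-done"

# Enqueue from Django/Flask code:  generate_report.delay(42)
# Run a worker:                    celery -A tasks worker --loglevel=info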
Responsibilities :
- Comply with coding standards and technical design.
- Adopt structured coding styles for easy review, testing, and maintainability of the code.
- Active participation in troubleshooting and debugging.
- Preparing technical documentation of code.
Hi,
Enterprise Minds is hiring Java Developer for Pune Location.
Title: Sr. Java Developer
Location : Pune
Exp : 7+ Years
- Strong Java background and backend database experience
- Must have strong experience in SQL
- Experience should include working on an enterprise-level product
- Should have knowledge of Apache Calcite
Position: Lead Backend Engineer
Location: Pune, India (initially remote due to COVID-19)
About the Organization:
It is one of the most exciting Bay Area product-driven organizations in the logistics industry, supporting more than 30,000 customers.
As a software engineer on the Backend team, you will be building and owning backend services and infrastructure that power our core products. We strive for a high standard of engineering quality while solving unique hardware as well as software challenges. You will have a high-impact role at a company that is relatively small for its user base. You thrive in fast-paced, unstructured environments that require you to wear many hats and think on your feet.
What we are looking for?
- 4+ years of software engineering experience
- 2+ years of experience in Python/Django or Go
- Familiarity with Node.js is a plus
- You write high quality and well-tested code to meet the needs of your customers.
- Good intuition for REST API design
- Start-up experience is a plus
- BS/MS/PhD in Computer Science or a related field (ideal)
- Preferably a Pune-based candidate who can join at the earliest / within a month
- Should be a Computer Science graduate from a Tier-1 engineering college such as IIT / BHU / NIT / VIT / COEP / PICT / BITS
Why join us?
- Very high-growth
- Passionate, collaborative, and awesome co-workers
- Free lunches
- Competitive salaries
1. Proficient in Python, Flask, Pandas, GitHub, and AWS
2. Good knowledge of databases, both SQL and NoSQL
3. Strong experience in REST and SOAP APIs
4. Experience working on scalable, interactive web applications
5. Basic knowledge of JavaScript and HTML
6. Automation and crawling tools and modules
7. Multithreading and multiprocessing
8. Good understanding of test-driven development
9. Preferred exposure to the finance domain
- B Tech/BE or M.Tech/ME in Computer Science or equivalent from a reputed college.
- Experience level of 7+ years in building large scale applications.
- Strong problem-solving skills, data structures, and algorithms.
- Experience with distributed systems handling large amounts of data.
- Excellent coding skills in Java / Python / Node / Go.
- Very good understanding of Web Technologies.
- Very good understanding of any RDBMS and/or messaging.
We are actively seeking software development engineers who are interested in designing robust trading systems and refining programs to efficiently manage various types of financial market data that facilitate our quantitative investment research. By designing and improving the firm's internal applications, the SDE will play a key role in expanding the firm's trading capabilities.
Responsibilities:
- Manage and scale up existing infrastructure for high-frequency market data capture.
- Develop a scalable and consistent data handling infrastructure for the above data to facilitate efficient backtesting of quantitative investment strategies.
- Perform R&D to build a software platform in Python for backtesting various kinds of investment strategies using the above databases.
- This will involve studying the strategy development process and performance evaluation metrics.
- Develop autopilot risk-management systems to monitor live performance of the Portfolio.
- Improve the existing algorithms to achieve better execution price and reduce the latency.
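A toy illustration of the backtesting loop such a platform would standardize, using pandas; the crossover rule and parameters are made up for the example.

import pandas as pd

def backtest(prices: pd.Series, fast: int = 10, slow: int = 30) -> pd.Series:
    """Toy moving-average crossover backtest: long when the fast MA is above
    the slow MA, flat otherwise. Returns the cumulative growth of 1 unit."""
    fast_ma = prices.rolling(fast).mean()
    slow_ma = prices.rolling(slow).mean()
    position = (fast_ma > slow_ma).astype(int).shift(1).fillna(0)  # trade next bar
    daily_returns = prices.pct_change().fillna(0)
    strategy_returns = position * daily_returns
    return (1 + strategy_returns).cumprod()

# Example usage with synthetic data:
# prices = pd.Series(range(1, 253), dtype="float64")
# equity_curve = backtest(prices)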
Requirements:
Our ideal candidate would have graduated with a degree in computer science from a top university, with 1-3 years of industry experience, along with:
- A high level of proficiency in Python and good knowledge of Matlab/C++/C#.
- Past experience dealing with large datasets; knowledge of database administration and network programming will be a plus.
- Well-versed in software engineering principles, frameworks and technologies.
- The ability to manage multiple tasks in a fast-paced environment.
- Excellent analytical and problem solving abilities.
- A keen interest in learning about the financial markets.
We, the Products team at DataWeave, build data products that provide timely insights that are readily consumable and actionable, at scale. Our underpinnings are scale, impact, engagement, and visibility. We help businesses take data-driven decisions every day. We also give them insights for long-term strategy. We are focused on creating value for our customers and helping them succeed.
How we work
It's hard to tell what we love more: problems or solutions! Every day, we choose to address some of the hardest data problems there are. We are in the business of making sense of messy public data on the web, at serious scale! Read more on Become a DataWeaver.
What do we offer?
- Opportunity to work on some of the most compelling data products that we are building for online retailers and brands.
- Ability to see the impact of your work and the value you are adding to our customers almost immediately.
- Opportunity to work on a variety of challenging problems and technologies to figure out what really excites you.
- A culture of openness. Fun work environment. A flat hierarchy. Organization wide visibility. Flexible working hours.
- Learning opportunities with courses, trainings, and tech conferences. Mentorship from seniors in the team.
- Last but not the least, competitive salary packages and fast paced growth opportunities.
Roles and Responsibilities:
● Build a low-latency serving layer that powers DataWeave's Dashboards, Reports, and Analytics functionality
● Build robust RESTful APIs that serve data and insights to DataWeave and other products
● Design user interaction workflows on our products and integrate them with data APIs
● Help stabilize and scale our existing systems. Help design the next generation systems.
● Scale our back end data and analytics pipeline to handle increasingly large amounts of data.
● Work closely with the Head of Products and UX designers to understand the product vision and design philosophy
● Lead/be a part of all major tech decisions. Bring in best practices. Mentor younger team members and interns.
● Constantly think scale, think automation. Measure everything. Optimize proactively.
● Be a tech thought leader. Add passion and vibrancy to the team. Push the envelope.
Skills and Requirements:
● 5-7 years of experience building and scaling APIs and web applications.
● Experience building and managing large scale data/analytics systems.
● Have a strong grasp of CS fundamentals and excellent problem solving abilities. Have a good understanding of software design principles and architectural best practices.
● Be passionate about writing code and have experience coding in multiple languages, including at least one scripting language, preferably Python.
● Be able to argue convincingly why feature X of language Y rocks/sucks, or why a certain design decision is right/wrong, and so on.
● Be a self-starter—someone who thrives in fast paced environments with minimal ‘management’.
● Have experience working with multiple storage and indexing technologies such as MySQL, Redis, MongoDB, Cassandra, Elastic.
● Good knowledge (including internals) of messaging systems such as Kafka and RabbitMQ.
● Use the command line like a pro. Be proficient in Git and other essential software development tools.
● Working knowledge of large-scale computational models such as MapReduce and Spark is a bonus.
● Exposure to one or more centralized logging, monitoring, and instrumentation tools, such as Kibana, Graylog, StatsD, Datadog etc.
● Working knowledge of building websites and apps. Good understanding of integration complexities and dependencies.
● Working knowledge of Linux server administration as well as the AWS ecosystem is desirable.
● It's a huge bonus if you have some personal projects (including open-source contributions) that you work on during your spare time. Show off some of the projects you have hosted on GitHub.
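As a nod to the MapReduce/Spark item above, a classic word-count sketch in PySpark; the input path is a placeholder.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("wordcount").getOrCreate()
sc = spark.sparkContext

# Hypothetical input: crawled web text stored one record per line.
lines = sc.textFile("s3://example-bucket/crawl/pages.txt")

counts = (
    lines.flatMap(lambda line: line.split())  # map: emit one record per word
         .map(lambda word: (word, 1))
         .reduceByKey(lambda a, b: a + b)      # reduce: sum counts per word
)

for word, count in counts.take(10):
    print(word, count)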