11+ SQLAlchemy Jobs in Pune | SQLAlchemy Job openings in Pune
Looking for a Python Lead/Architect
Must Have:
- Able to architect an application from scratch
- Able to refactor code
- Knowledge of Flask and Django
- Team player
- Able to lead the team and guide them
- Deployment of code on Azure platform
Good to have:
- Knowledge of SQLAlchemy
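Since SQLAlchemy is the skill this page is organized around, here is a minimal sketch of its ORM in action. The `User` model, table name, and in-memory SQLite database are hypothetical illustrations, not taken from any posting above.

```python
# Minimal SQLAlchemy ORM sketch: declare a model, create the schema in an
# in-memory SQLite database, insert a row, and query it back.
from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class User(Base):
    __tablename__ = "users"
    id = Column(Integer, primary_key=True)
    name = Column(String, nullable=False)

engine = create_engine("sqlite:///:memory:")  # throwaway in-memory DB
Base.metadata.create_all(engine)              # emit CREATE TABLE

Session = sessionmaker(bind=engine)
with Session() as session:
    session.add(User(name="asha"))
    session.commit()
    names = [u.name for u in session.query(User).all()]
```

The same declarative-mapping pattern scales up to the multi-table schemas a lead/architect would design in Flask or Django projects.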
We are hiring for a Python Developer at Wissen Technology!
📍 Location: Pune (Hybrid)
💼 Experience: 3–6 Years
⏱️ Notice Period: Immediate / 15 days preferred
🔧 Key Skills:
• Strong experience in Python
• Hands-on with Pandas & NumPy
• Experience with AWS (S3, Lambda preferred)
• Good understanding of data processing & APIs
• SQL knowledge
🏢 About Wissen Technology:
Wissen Technology, part of the Wissen Group (est. 2000), is a fast-growing technology company specializing in high-end consulting across Banking, Finance, Telecom, and Healthcare domains.
✔️ Global presence – US, India, UK, Australia, Mexico & Canada
✔️ Certified Great Place to Work®
✔️ Trusted by Fortune 500 clients like Morgan Stanley, Goldman Sachs, and more
✔️ Strong growth with 400% revenue increase in recent years
🌐 Website: www.wissen.com
🔗 LinkedIn: https://www.linkedin.com/company/wissen-technology/
If you’re interested or have relevant candidates, please share your resume at [your email].
#Hiring #PythonDeveloper #PuneJobs #AWS #ImmediateJoiner
While you may already know about Wissen and the company history, here is a quick rundown for you.
About Wissen Technology:
· The Wissen Group was founded in the year 2000. Wissen Technology, a part of Wissen Group, was established in the year 2015.
· Wissen Technology is a specialized technology company that delivers high-end consulting for organizations in the Banking & Finance, Telecom, and Healthcare domains. We help clients build world class products.
· Our workforce comprises highly skilled professionals, with leadership and senior management executives who have graduated from premier institutions such as Wharton, MIT, the IITs, IIMs, and NITs, and who bring rich work experience from some of the biggest companies in the world.
· Wissen Technology has grown its revenues by 400% in these five years without any external funding or investments.
· Globally present with offices in the US, India, UK, Australia, Mexico, and Canada.
· We offer an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, Quality Assurance & Test Automation.
· Wissen Technology has been certified as a Great Place to Work®.
· Wissen Technology has been voted as the Top 20 AI/ML vendor by CIO Insider in 2020.
· Over the years, Wissen Group has successfully delivered $650 million worth of projects for more than 20 of the Fortune 500 companies.
· We have served clients across sectors like Banking, Telecom, Healthcare, Manufacturing, and Energy, including the likes of Morgan Stanley, Goldman Sachs, MSCI, State Street, Flipkart, Swiggy, Trafigura, and GE, to name a few.
Job Title: Application Development Engineer (Python – Backtesting & Index Platforms)
Role Overview
Key Responsibilities
Engine Development: Design and implement modular, reusable Python components for index construction, rebalancing, and backtesting.
Large-Scale Simulation: Use Pandas, NumPy, and PySpark to run historical calculations across long time horizons and multiple index variants.
Workflow Integration: Integrate engines with orchestrators such as Airflow or Temporal using parameterized, config-driven execution.
Reference Data Consumption: Query and utilize pricing, security master, and corporate action data from Snowflake.
Quality & Reconciliation: Build automated test harnesses to validate outputs, compare against benchmarks, and guarantee reproducibility.
Performance Optimization: Improve runtime efficiency through vectorization, caching, and distributed computing patterns.
Cross-Team Collaboration: Partner with Business, Index Ops, and Platform teams to accelerate research-to-production onboarding.
Required Technical Capabilities
Python Expertise: Strong proficiency in Python application development with emphasis on clean architecture and maintainable design.
Data & Numerical Libraries: Deep experience with Pandas and NumPy; working knowledge of PySpark for distributed workloads.
Financial Computation: Ability to implement portfolio mathematics, weighting algorithms, and time-series transformations.
Config-Driven Systems: Experience building rule-based or metadata-driven processing frameworks.
Database Skills: Strong SQL and experience consuming structured data from Snowflake.
Testing Discipline: Expertise in unit testing, regression testing, and deterministic replay of calculations.
Orchestration Integration: Familiarity with Airflow, Temporal, or similar workflow engines.
Cloud Infrastructure: Solid understanding of AWS ecosystem services (S3, Lambda, IAM) and how they integrate with the Snowflake Data Cloud.
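The portfolio-mathematics and vectorization capabilities above can be sketched in a few lines of NumPy. The weighting scheme (market-cap), the sample caps, and the return matrix below are hypothetical illustrations of the pattern, not the employer's actual methodology.

```python
import numpy as np

def cap_weights(market_caps: np.ndarray) -> np.ndarray:
    """Market-cap weights for one rebalance date (weights sum to 1)."""
    return market_caps / market_caps.sum()

def backtest_index(weights: np.ndarray, returns: np.ndarray) -> np.ndarray:
    """Cumulative index level from per-period constituent returns.

    weights: (n_assets,) held fixed between rebalances.
    returns: (T, n_assets) per-period simple returns.
    """
    period_returns = returns @ weights        # vectorized portfolio return
    return np.cumprod(1.0 + period_returns)   # index level, base 1.0

w = cap_weights(np.array([300.0, 200.0, 500.0]))
levels = backtest_index(w, np.array([[0.01, -0.02, 0.005],
                                     [0.00,  0.03, -0.01]]))
```

A real engine would loop this over rebalance dates with config-driven weighting rules, but the vectorized core (`returns @ weights`, `cumprod`) stays the same.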
* Python (3 to 6 years): Strong expertise in data workflows and automation
* Spark (PySpark): Hands-on experience with large-scale data processing
* Pandas: For detailed data analysis and validation
* Delta Lake: Managing structured and semi-structured datasets at scale
* SQL: Querying and performing operations on Delta tables
* Azure Cloud: Compute and storage services
* Orchestrator: Good experience with either ADF or Airflow
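Orchestrators like ADF and Airflow typically drive parameterized, config-driven steps. A dependency-free sketch of that pattern in plain Python (step names and parameters are hypothetical):

```python
# Config-driven pipeline execution: each step is declared as data and
# dispatched by name, so adding a step means editing config, not code.
CONFIG = {
    "steps": [
        {"op": "filter", "params": {"min_value": 10}},
        {"op": "scale",  "params": {"factor": 2}},
    ]
}

# Registry mapping step names to implementations.
OPS = {
    "filter": lambda rows, min_value: [r for r in rows if r >= min_value],
    "scale":  lambda rows, factor: [r * factor for r in rows],
}

def run_pipeline(rows, config):
    for step in config["steps"]:
        rows = OPS[step["op"]](rows, **step["params"])
    return rows

result = run_pipeline([5, 10, 20], CONFIG)  # -> [20, 40]
```

In Airflow or ADF the same idea appears as DAG/pipeline parameters feeding templated tasks; the registry-plus-config shape is what makes runs reproducible.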
🚀 We’re Hiring: Senior Python Backend Developer 🚀
📍 Location: Baner, Pune (Work from Office)
💰 Compensation: ₹6 LPA
🕑 Experience Required: Minimum 2 years as a Python Backend Developer
About Us
Foto Owl AI is a fast-growing product-based company headquartered in Baner, Pune.
We specialize in:
⚡ Hyper-personalized fan engagement
🤖 AI-powered real-time photo sharing
📸 Advanced media asset management
What You’ll Do
As a Senior Python Backend Developer, you’ll play a key role in designing, building, and deploying scalable backend systems that power our cutting-edge platforms.
Architect and develop complex, secure, and scalable backend services
Build and maintain APIs & data pipelines for web, mobile, and AI-driven platforms
Optimize SQL & NoSQL databases for high performance
Manage AWS infrastructure (EC2, S3, RDS, Lambda, CloudWatch, etc.)
Implement observability, monitoring, and security best practices
Collaborate cross-functionally with product & AI teams
Mentor junior developers and conduct code reviews
Troubleshoot and resolve production issues with efficiency
What We’re Looking For
✅ Strong expertise in Python backend development
✅ Solid knowledge of Data Structures & Algorithms
✅ Hands-on experience with SQL (PostgreSQL/MySQL) and NoSQL (MongoDB, Redis, etc.)
✅ Proficiency in RESTful APIs & Microservice design
✅ Knowledge of Docker, Kubernetes, and cloud-native systems
✅ Experience managing AWS-based deployments
Why Join Us?
At Foto Owl AI, you’ll be part of a passionate team building world-class media tech products used in sports, events, and fan engagement platforms. If you love scalable backend systems, real-time challenges, and AI-driven products, this is the place for you.
We are seeking a highly skilled and motivated Python Developer with hands-on experience in AWS cloud services (Lambda, API Gateway, EC2), microservices architecture, PostgreSQL, and Docker. The ideal candidate will be responsible for designing, developing, deploying, and maintaining scalable backend services and APIs, with a strong emphasis on cloud-native solutions and containerized environments.
Key Responsibilities:
- Develop and maintain scalable backend services using Python (Flask, FastAPI, or Django).
- Design and deploy serverless applications using AWS Lambda and API Gateway.
- Build and manage RESTful APIs and microservices.
- Implement CI/CD pipelines for efficient and secure deployments.
- Work with Docker to containerize applications and manage container lifecycles.
- Develop and manage infrastructure on AWS (including EC2, IAM, S3, and other related services).
- Design efficient database schemas and write optimized SQL queries for PostgreSQL.
- Collaborate with DevOps, front-end developers, and product managers for end-to-end delivery.
- Write unit, integration, and performance tests to ensure code reliability and robustness.
- Monitor, troubleshoot, and optimize application performance in production environments.
Required Skills:
- Strong proficiency in Python and Python-based web frameworks.
- Experience with AWS services: Lambda, API Gateway, EC2, S3, CloudWatch.
- Sound knowledge of microservices architecture and asynchronous programming.
- Proficiency with PostgreSQL, including schema design and query optimization.
- Hands-on experience with Docker and containerized deployments.
- Understanding of CI/CD practices and tools like GitHub Actions, Jenkins, or CodePipeline.
- Familiarity with API documentation tools (Swagger/OpenAPI).
- Version control with Git.
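The Lambda + API Gateway responsibility above boils down to a handler with a well-known event/response shape. This is a minimal sketch of a proxy-integration handler; the request field (`name`) and greeting are hypothetical.

```python
import json

def lambda_handler(event, context):
    """Minimal AWS Lambda handler behind API Gateway (proxy integration).

    event["body"] arrives as a JSON string; the response must carry
    statusCode, headers, and a string body.
    """
    body = json.loads(event.get("body") or "{}")
    name = body.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# Local invocation with a fabricated API Gateway-style event.
resp = lambda_handler({"body": json.dumps({"name": "pune"})}, None)
```

Because the handler is a plain function, it unit-tests locally without any AWS infrastructure, which is exactly what the testing bullet above calls for.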
1. Should have worked in Agile methodology and microservices architecture
2. Should have 7+ years of experience in Python and Django framework
3. Should have a good knowledge of DRF
4. Should have knowledge of User Auth (JWT, OAuth2), API Auth, Access Control List, etc.
5. Should have working experience in session management in Django
6. Should have expertise in the Django MVC pattern and the use of templates on the frontend
7. Should have working experience in PostgreSQL
8. Should have working experience with the RabbitMQ message broker and Celery task queues
9. Good to have: knowledge of JavaScript implementation within Django templates
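To illustrate the JWT auth concept from point 4, here is a stdlib-only sketch of HS256 token signing and verification. A real Django/DRF project would use a library such as PyJWT or djangorestframework-simplejwt rather than hand-rolling this; the payload and secret are hypothetical.

```python
import base64
import hashlib
import hmac
import json

def _b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWT requires."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload: dict, secret: bytes) -> str:
    """Build header.payload.signature with HMAC-SHA256 (HS256)."""
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64url(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = _b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify_jwt(token: str, secret: bytes) -> bool:
    """Recompute the signature and compare in constant time."""
    header, body, sig = token.split(".")
    signing_input = f"{header}.{body}".encode()
    expected = _b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return hmac.compare_digest(sig, expected)

token = sign_jwt({"sub": "42"}, b"secret")
```

Verification with the wrong secret fails, which is the property API auth and access-control checks build on.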
· The Objective:
You will play a crucial role in designing, implementing, and maintaining our data infrastructure, running tests, and updating the systems.
· Job function and requirements
o Expert in Python, Pandas, and NumPy, with knowledge of a Python web framework such as Django or Flask.
o Able to integrate multiple data sources and databases into one system.
o Basic understanding of frontend technologies like HTML, CSS, JavaScript.
o Able to build data pipelines.
o Strong unit test and debugging skills.
o Understanding of fundamental design principles behind a scalable application
o Good understanding of relational databases such as MySQL or PostgreSQL.
o Able to analyze and transform raw data.
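The "analyze and transform raw data" requirement maps directly onto Pandas. A small sketch with hypothetical column names:

```python
import pandas as pd

# Fabricated raw rows standing in for ingested source data.
raw = pd.DataFrame({
    "account": ["a", "a", "b"],
    "amount":  [10.0, 5.0, 7.5],
})

# Transform: aggregate raw rows into one summary row per account.
summary = (raw.groupby("account", as_index=False)["amount"]
              .sum()
              .rename(columns={"amount": "total_amount"}))
```

Chained steps like this (group, aggregate, rename) are the building blocks of the data pipelines mentioned above.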
· About us
Mitibase helps companies find the most relevant warm prospects every month, then helps their teams act on them with automation. We do so by automatically tracking key accounts and contacts for job changes and relationship triggers, surfacing them as warm leads in your sales pipeline.
- Designing scalable systems for high load.
- Defining and improving the development processes, covering both implementation and quality assurance
- Architecting complex scalable systems with a keen eye towards performance, security and availability while also taking on a super hands-on role towards implementation
- Spearheading all inbound and outbound API integrations to build the most robust and scalable integration platform in the B2B Retail space.
- Working on interesting technical challenges in a product centric and open-source driven environment.
- Using open source as much as possible, and blogging about cool things that you learnt and built.
What you need to have:
- B.Tech /B.E.; Any Graduation
- Strong relational DB experience preferred
- Must be very much in touch with backend coding and want to do it every day
- Our stack is primarily built around Node (Loopback), Mongo and ElasticSearch.
- Deep familiarity with Git and basic working knowledge of DevOps (Server and DB config, Docker, Kubernetes etc) is strongly preferred.
- Deep knowledge of NodeJS, PHP, MongoDB and MySQL.
- The role requires a good knowledge of Algorithmic Design and Architecture, Data structures, OOPS Concepts, serverless architectures and complex problem solving skills.
- You will help set a very high bar on code quality.
- We have started the transition towards microservices; one of your core responsibilities is to ensure microservices are used wherever they make sense
Looking for a Data Engineer for our own organization.
Notice Period: 15-30 days
CTC: up to 15 LPA
Preferred Technical Expertise
- Expertise in Python programming.
- Proficient in Pandas/Numpy Libraries.
- Experience with Django framework and API Development.
- Proficient in writing complex queries using SQL
- Hands on experience with Apache Airflow.
- Experience with source code versioning tools such as GIT, Bitbucket etc.
Good to have Skills:
- Create and maintain Optimal Data Pipeline Architecture
- Experienced in handling large structured data.
- Demonstrated ability in solutions covering data ingestion, data cleansing, ETL, Data mart creation and exposing data for consumers.
- Experience with any cloud platform (GCP is a plus)
- Experience with JQuery, HTML, Javascript, CSS is a plus.
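The "complex queries using SQL" skill can be illustrated with the stdlib `sqlite3` module: an aggregate with `GROUP BY`/`HAVING` over a throwaway in-memory table. Table and column names are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, amount REAL);
    INSERT INTO orders VALUES ('a', 10), ('a', 20), ('b', 5);
""")

# Total per customer, keeping only customers whose total exceeds 10.
rows = conn.execute("""
    SELECT customer, SUM(amount) AS total
    FROM orders
    GROUP BY customer
    HAVING total > 10
    ORDER BY total DESC
""").fetchall()
# rows -> [('a', 30.0)]
```

The same query shape carries over to PostgreSQL or Snowflake warehouses; only the connection object changes.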





