Job Overview:
We are looking for a skilled Senior Backend Engineer to join our team. The ideal candidate will have a strong foundation in Java and Spring, with proven experience building scalable microservices and backend systems. The role also calls for familiarity with automation tools and Python development, and a working knowledge of AI technologies.
Responsibilities:
- Design, develop, and maintain backend services and microservices.
- Build and integrate RESTful APIs across distributed systems.
- Ensure performance, scalability, and reliability of backend systems.
- Collaborate with cross-functional teams and participate in agile development.
- Deploy and maintain applications on AWS cloud infrastructure.
- Contribute to automation initiatives and AI/ML feature integration.
- Write clean, testable, and maintainable code following best practices.
- Participate in code reviews and technical discussions.
Required Skills:
- 4+ years of backend development experience.
- Strong proficiency in Java and Spring/Spring Boot frameworks.
- Solid understanding of microservices architecture.
- Experience with REST APIs, CI/CD, and debugging complex systems.
- Proficient in AWS services such as EC2, Lambda, S3.
- Strong analytical and problem-solving skills.
- Excellent communication in English (written and verbal).
Good to Have:
- Experience with automation tools like Workato or similar.
- Hands-on experience with Python development.
- Familiarity with AI/ML features or API integrations.
- Comfortable working with US-based teams (flexible hours).
About Vijay Sales
Vijay Sales is one of India’s leading electronics retail brands with 160+ stores nationwide and a fast-growing digital presence. We are on a mission to build the most advanced data-driven retail intelligence ecosystem—using AI, predictive analytics, LLMs, and real-time automation to transform customer experience, supply chain, and omnichannel operations.
Role Overview
We are looking for a highly capable AI Engineer who is passionate about building production-grade AI systems, designing scalable ML architecture, and working with cutting-edge AI/ML tools. This role involves hands-on work with Databricks, SQL, PySpark, modern LLM/GenAI frameworks, and full lifecycle ML system design.
Key Responsibilities
Machine Learning & AI Development
- Build, train, and optimize ML models for forecasting, recommendation, personalization, churn prediction, inventory optimization, anomaly detection, and pricing intelligence.
- Develop GenAI solutions using modern LLM frameworks (e.g., LangChain, LlamaIndex, HuggingFace Transformers).
- Explore and implement RAG (Retrieval Augmented Generation) pipelines for product search, customer assistance, and support automation.
- Fine-tune LLMs on company-specific product and sales datasets (using QLoRA, PEFT, and Transformers).
- Develop scalable feature engineering pipelines leveraging Delta Lake and Databricks Feature Store.
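The retrieval step behind a RAG pipeline reduces to a small core: embed documents, embed the query, rank by similarity. The sketch below illustrates just that mechanic in plain Python; the three-dimensional vectors and document ids are toy stand-ins for a real embedding model and vector database, not part of any actual stack.

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def retrieve(query_vec, doc_index, top_k=2):
    """Return the top_k document ids ranked by similarity to the query."""
    ranked = sorted(doc_index.items(),
                    key=lambda kv: cosine(query_vec, kv[1]),
                    reverse=True)
    return [doc_id for doc_id, _ in ranked[:top_k]]

# Toy 3-dimensional "embeddings" standing in for a real embedding model.
index = {
    "tv-warranty-faq": [0.9, 0.1, 0.0],
    "fridge-specs":    [0.1, 0.8, 0.1],
    "delivery-policy": [0.0, 0.2, 0.9],
}

hits = retrieve([0.85, 0.15, 0.05], index, top_k=1)
# hits -> ["tv-warranty-faq"]
```

In a production pipeline the retrieved documents would then be stuffed into the LLM prompt as grounding context; frameworks like LangChain or LlamaIndex wrap exactly this loop.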
Databricks / Data Engineering
- Build end-to-end ML workflows on Databricks using PySpark, MLflow, Unity Catalog, Delta Live Tables.
- Optimize Databricks clusters for cost, speed, and stability.
- Maintain reusable notebooks and parameterized pipelines for model ingestion, validation, and deployment.
- Use MLflow for tracking experiments, model registry, and lifecycle management.
Data Handling & SQL
- Write advanced SQL for multi-source data exploration, aggregation, and anomaly detection.
- Work on large, complex datasets from ERP, POS, CRM, website, and supply chain systems.
- Automate ingestion of streaming and batch data into Databricks pipelines.
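As a minimal illustration of the windowed SQL this kind of anomaly detection relies on, the sketch below uses Python's built-in sqlite3 as a stand-in for the warehouse: a CTE computes a trailing average per store with a window function, then flags days that spike well above it. The table, values, and 2x threshold are invented for the example.

```python
import sqlite3

# In-memory table of daily store sales (illustrative data).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (store TEXT, day INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("pune-01", d, 100.0) for d in range(1, 6)] + [("pune-01", 6, 400.0)],
)

# CTE + window function: trailing average of the previous days per store,
# then flag any day that more than doubles it.
rows = conn.execute("""
    WITH rolling AS (
        SELECT store, day, amount,
               AVG(amount) OVER (
                   PARTITION BY store ORDER BY day
                   ROWS BETWEEN 5 PRECEDING AND 1 PRECEDING
               ) AS trailing_avg
        FROM sales
    )
    SELECT day, amount, trailing_avg
    FROM rolling
    WHERE trailing_avg IS NOT NULL AND amount > 2 * trailing_avg
""").fetchall()
# rows -> [(6, 400.0, 100.0)]
```

The same shape of query runs unchanged on Databricks SQL; only the connection and scale differ.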
Deployment & MLOps
- Deploy ML models using REST APIs, Databricks Model Serving, Docker, or cloud-native endpoints.
- Build CI/CD pipelines for ML using GitHub Actions, Azure DevOps, or Databricks Workflows.
- Implement model monitoring for drift, accuracy decay, and real-time alerts.
- Maintain GPU/CPU environments for training workflows.
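A bare-bones version of the drift monitoring mentioned above can be sketched as a mean-shift check: compare the live feature distribution against the training-time baseline in units of baseline standard deviation. The 3-sigma alert threshold and the numbers are illustrative choices, not a prescribed method.

```python
from statistics import mean, stdev

def drift_score(baseline, live):
    """Shift of the live mean, in units of baseline standard deviation.
    A score above ~3 is a common trigger for a drift alert."""
    mu, sigma = mean(baseline), stdev(baseline)
    return abs(mean(live) - mu) / sigma if sigma else float("inf")

baseline = [10.0, 11.0, 9.0, 10.5, 9.5]   # training-time feature values
stable   = [10.2, 9.8, 10.1]              # similar distribution
shifted  = [25.0, 26.0, 24.5]             # distribution has moved

alert = drift_score(baseline, shifted) > 3.0   # True: raise an alert
ok    = drift_score(baseline, stable) > 3.0    # False: no drift
```

Production systems typically replace this with PSI or KS-style tests per feature, but the monitoring loop (baseline, live window, threshold, alert) is the same.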
Must-Have Technical Skills
Core AI/ML
- Strong fundamentals in machine learning: regression, classification, time-series forecasting, clustering.
- Experience in deep learning using PyTorch or TensorFlow/Keras.
- Expertise in LLMs, embeddings, vector databases, and GenAI architecture.
- Hands-on experience with HuggingFace, embedding models, and RAG.
Databricks & Big Data
- Hands-on experience with Databricks (PySpark, SQL, Delta Lake, MLflow, Feature Store).
- Strong understanding of Spark execution, partitioning, and optimization.
Programming
- Strong proficiency in Python.
- Experience writing high-performance SQL with window functions, CTEs, and analytical queries.
- Knowledge of Git, CI/CD, REST APIs, and Docker.
MLOps & Production Engineering
- Experience deploying models to production and monitoring them.
- Familiarity with tools like MLflow, Weights & Biases, or SageMaker equivalents.
- Experience in building automated training pipelines and handling model drift/feedback loops.
Preferred Domain Experience
- Retail/e-commerce analytics
- Demand forecasting
- Inventory optimization
- Customer segmentation & personalization
- Price elasticity and competitive pricing
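Price elasticity itself is a short calculation that pricing-intelligence models build on. The midpoint (arc) formula below uses invented numbers purely for illustration: a price cut that lifts unit sales enough to raise revenue implies elastic demand (|e| > 1).

```python
def arc_elasticity(p0, q0, p1, q1):
    """Arc (midpoint) price elasticity of demand: % change in quantity
    divided by % change in price, using midpoints so the result is the
    same whichever direction the price moves."""
    dq = (q1 - q0) / ((q0 + q1) / 2)
    dp = (p1 - p0) / ((p0 + p1) / 2)
    return dq / dp

# Hypothetical numbers: cutting a TV's price from 50,000 to 45,000
# lifts weekly units sold from 100 to 130.
e = arc_elasticity(50_000, 100, 45_000, 130)
# |e| > 1 means demand is elastic: here the price cut raises revenue
# (50,000 * 100 = 5.0M vs 45,000 * 130 = 5.85M).
```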

Lead Frontend Architect (Vue.js & Firebase)
Amplifai transforms AI potential into measurable business value, guiding organizations from strategic planning to execution. With deep expertise in AI product development, technical architecture, regulatory compliance, and commercialization, we deliver secure, ethical, and high-performing solutions. Having co-founded one of Europe’s most innovative AI companies, our team drives unparalleled growth for clients through cutting-edge technologies like GPT tools, AI agents, and modern frameworks. Join our new Pune office to shape the future of AI-driven innovation!
One of our partners is transforming how the construction industry measures and manages carbon emissions, helping organizations meet their sustainability goals with accurate, scalable, and actionable insights. Their SaaS platform enables carbon footprint calculations, Life Cycle Assessment (LCA) data management, and complex environmental reporting — and we’re ready to take it from 70 customers to 700+ enterprise clients.
We’re seeking a Senior Cloud Architect & Tech Lead to spearhead the next phase of our platform’s growth. You’ll lead architectural decisions for a complex sustainability and carbon accounting platform built on Firebase/Google Cloud with a Vue.js frontend, driving scalability, enterprise readiness, and technical excellence. This is a hands-on leadership role where you’ll guide the engineering team, optimize system performance, and shape a long-term technical roadmap to support 10x growth — all while leveraging cutting-edge GenAI developer tools like Cursor, Claude, Lovable, and GitHub Copilot to accelerate delivery and innovation.
Key Responsibilities:
· Lead architecture design for a highly scalable, enterprise-ready SaaS platform built with Vue.js, Firebase Functions (Node.js), Firestore, Redis, and GenKit AI.
· Design and optimize complex hierarchical data models and computational workloads for high performance at scale.
· Evaluate platform evolution options — from deep Firebase optimizations to potential migration strategies — balancing technical debt, scalability, and enterprise needs.
· Implement SOC2/ISO27001-ready security controls including audit logging, data encryption, and enterprise-grade access management.
· Drive performance engineering to address Firestore fan-out queries, function cold starts, and database scaling bottlenecks.
· Oversee CI/CD automation and deployment pipelines for multi-environment enterprise releases.
· Design APIs and integration strategies to meet enterprise customer requirements and enable global scaling.
· Mentor and guide the development team, ensuring technical quality, scalability, and adoption of best practices.
· Collaborate cross-functionally with product managers, sustainability experts, and customer success teams to deliver impactful features and integrations.
· Plan and execute disaster recovery strategies, business continuity procedures, and cost-optimized infrastructure scaling.
· Maintain comprehensive technical documentation for architecture, processes, and security controls.
Required Skills & Experience:
· 5+ years of Google Cloud Platform experience with deep expertise in the Firebase ecosystem.
· Proven ability to scale SaaS platforms through 5–10x growth phases, ideally in an enterprise B2B environment.
· Strong background in serverless architecture, event-driven systems, and scaling NoSQL databases (Firestore, MongoDB, DynamoDB).
· Expertise in Vue.js for large-scale application performance and maintainability.
· Hands-on experience implementing enterprise security frameworks (SOC2, ISO27001) and compliance requirements.
· Demonstrated daily use of GenAI developer tools such as Cursor, Claude, Lovable, and GitHub Copilot to accelerate coding, documentation, and architecture work.
· Track record of performance optimization for high-traffic production systems.
· 3+ years leading engineering teams through architectural transitions and complex technical challenges.
· Strong communication skills to work with both technical and non-technical stakeholders.
Preferred Qualifications
· Domain knowledge in construction industry workflows or sustainability technology (LCA, carbon accounting).
· Experience with numerical computing, scientific applications, or computationally intensive workloads.
· Familiarity with multi-region deployments and advanced analytics architectures.
· Knowledge of data residency and privacy regulations.
· Knowledge of BIM (Building Information Modeling), IFC standards for construction and engineering data interoperability.
Ideal Candidate
You’re a Senior Software Engineer who thrives on scaling complex systems for enterprise customers. You embrace GenAI tools as an integral part of your development workflow, using platforms like Cursor, Claude, Lovable, and GitHub Copilot to deliver faster and smarter. Experience with BIM, IFC, or Speckle is a strong plus, enabling you to bridge sustainability tech with real-world construction data standards. You balance deep technical execution with strategic thinking and can communicate effectively across teams. While direct sustainability or construction tech experience is a plus, your ability to quickly master complex domains is what will set you apart.
Job Description:
We are looking for an experienced SAP TM Functional Consultant with a minimum of 6 years of relevant experience in SAP Transportation Management. The ideal candidate should be skilled in end-to-end implementation and support of SAP TM modules, including Freight Order management, rate calculation, and logistics integration. Prior experience in handling business process configuration and client interaction is a must.
Position: Technology Analyst - 489195
Experience: 5-8 Years
Job Location: Pune
Notice Period: 0-15 days (candidates serving a 30-day notice can be considered)
Note: Apply only after carefully reviewing the JD and other details.
Infomatics Corp has been a leader in providing exceptional services to both corporate and government clients for over 15 years. We are recognized for our commitment to excellence and security, holding prestigious certifications such as ISO 9001, ISO 27001, and ISO 20000. With offices in the United States and India, we operate on a global scale, offering in-demand IT solutions and the flexibility to scale operations according to client needs.
We are looking for an ambitious Technology Analyst.
Responsibilities:
- Strong understanding of test automation frameworks; well versed in Java coding with Selenium.
- Develop and implement automated test frameworks for our web applications.
- Design, develop, and maintain an automation framework from scratch using best practices.
- Hands-on experience with TestNG, the Page Object Model, and BDD Cucumber automation using Java, Selenium, Rest Assured, and D365.
- Experience building CI/CD pipelines with GitLab, Jenkins, or Maven.
- Good communication skills and regular status reporting to the client.
- Strong grasp of Agile practices and functional testing.
Mandatory Skills:
Framework development, Java, Selenium, CI/CD pipelines, APIs, D365, functional testing (FT)
Job Description:
As Azure Lead Data Engineer, you will own the key activities of designing, coding, and implementing data solutions on the platform. With a focus on leveraging DBT for data transformation and modelling, as well as expertise in MDM tools, you will play a pivotal role in architecting scalable, performant data pipelines and warehouses. You will collaborate closely with cross-functional teams to understand business requirements, architect data solutions, and ensure successful project delivery. You will lead a team of skilled engineers who together deliver scalable, highly dependable data solutions for customers.
Responsibilities:
- Lead the design, development, and implementation of data solutions on the Microsoft Azure platform.
- Architect data pipelines, data warehouses, and data lakes using Azure services such as Azure Data Factory, Azure Databricks, Azure Synapse Analytics, and Azure Blob Storage.
- Design and implement ETL processes to extract, transform, and load data from various sources into Azure data platforms, utilizing DBT for data transformation.
- Develop scalable and efficient data models to support analytics, reporting, and machine learning initiatives, with a strong emphasis on using DBT for modelling.
- Lead performance optimization efforts to ensure the efficient processing of large volumes of data.
- Mentor and coach junior team members, providing guidance on best practices, technical expertise, and professional development.
- Collaborate with stakeholders to understand business requirements and translate them into technical solutions.
- Stay abreast of emerging technologies and industry trends in data engineering and cloud computing.
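The extract-transform-load flow described in the responsibilities can be sketched end to end with Python's built-in sqlite3 standing in for an Azure warehouse. The dbt-style step is just a SELECT that materializes a cleaned model from a raw table; all table and column names here are illustrative, not from any real pipeline.

```python
import sqlite3

# Extract: raw orders land in a staging table (illustrative data).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, 120.0, "PAID"), (2, 80.0, "cancelled"), (3, 200.0, "Paid")],
)

# Transform: in dbt, each model is just a SELECT. Here we materialize
# a cleaned model that normalizes status and drops cancelled orders.
conn.execute("""
    CREATE TABLE stg_orders AS
    SELECT id, amount, LOWER(status) AS status
    FROM raw_orders
    WHERE LOWER(status) != 'cancelled'
""")

# Load/serve: downstream analytics query the cleaned model.
total = conn.execute("SELECT SUM(amount) FROM stg_orders").fetchone()[0]
# total -> 320.0
```

On Azure the extract would come from Data Factory, the transform would be a dbt model compiled against Synapse or Databricks, and the materialization strategy (view, table, incremental) would be dbt configuration rather than a literal CREATE TABLE.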
Qualifications:
- BE in Computer Science or a related field.
- ~10 years of experience in data engineering, designing, and implementing data solutions on the Microsoft Azure platform.
- Deep understanding of Azure services such as Azure Data Factory, Azure Databricks, Azure Synapse Analytics, and Azure Blob Storage.
- Proficiency in DBT (Data Build Tool) for data transformation and modelling.
- Experience working with any Master Data Management (MDM) tools.
- Experience with data governance and metadata management tools such as Azure Purview or similar.
- Proficiency in programming languages such as Python, Scala, or Java.
- Experience with big data technologies such as Hadoop, Spark, and Kafka.
- Strong leadership skills with the ability to lead and mentor a team of engineers.
