11+ Transformer Jobs in Pune | Transformer Job openings in Pune
Job Description
Phonologies is seeking a Senior Data Engineer to lead data engineering efforts for developing and deploying generative AI and large language models (LLMs). The ideal candidate will excel in building data pipelines, fine-tuning models, and optimizing infrastructure to support scalable AI systems for enterprise applications.
Role & Responsibilities
- Data Pipeline Management: Design and manage pipelines for AI model training, ensuring efficient data ingestion, storage, and transformation for real-time deployment.
- LLM Fine-Tuning & Model Lifecycle: Fine-tune LLMs on domain-specific data, and oversee the model lifecycle using tools like MLFlow and Weights & Biases.
- Scalable Infrastructure: Optimize infrastructure for large-scale data processing and real-time LLM performance, leveraging containerization and orchestration in hybrid/cloud environments.
- Data Management: Ensure data quality, security, and compliance, with workflows for handling sensitive and proprietary datasets.
- Continuous Improvement & MLOps: Apply MLOps/LLMOps practices for automation, versioning, and lifecycle management, while refining tools and processes for scalability and performance.
- Collaboration: Work with data scientists, engineers, and product teams to integrate AI solutions and communicate technical capabilities to business stakeholders.
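The model-lifecycle bullet above can be sketched as a minimal run-tracking record. This is a stdlib-only illustration of the kind of bookkeeping that tools like MLFlow and Weights & Biases automate; the parameter and metric names are hypothetical.

```python
import json
import time
import uuid


def log_run(params, metrics):
    """Record one fine-tuning run as a JSON-serializable dict --
    a stdlib stand-in for what an experiment tracker stores per run."""
    return {
        "run_id": uuid.uuid4().hex,   # unique id per training run
        "timestamp": time.time(),     # when the run was logged
        "params": dict(params),       # hyperparameters for the run
        "metrics": dict(metrics),     # evaluation results for the run
    }


# Hypothetical fine-tuning run: all values are illustrative only.
run = log_run(
    params={"base_model": "gpt-style-llm", "learning_rate": 2e-5, "epochs": 3},
    metrics={"eval_loss": 1.84, "token_accuracy": 0.61},
)
print(json.dumps(run["params"], sort_keys=True))
```

In practice a tracker also versions the model artifact itself, so a run can be promoted or rolled back later in the lifecycle.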
Preferred Candidate Profile
- Experience: 5+ years in data engineering, focusing on AI/ML infrastructure, LLM fine-tuning, and deployment.
- Technical Skills: Advanced proficiency in Python, SQL, and distributed data tools.
- Model Management: Hands-on experience with MLFlow, Weights & Biases, and model lifecycle management.
- AI & NLP Expertise: Familiarity with LLMs (e.g., GPT, BERT) and NLP frameworks like Hugging Face Transformers.
- Cloud & Infrastructure: Strong skills with AWS, Azure, Google Cloud, Docker, and Kubernetes.
- MLOps/LLMOps: Expertise in versioning, CI/CD, and automating AI workflows.
- Collaboration & Communication: Proven ability to work with cross-functional teams and explain technical concepts to non-technical stakeholders.
- Education: Degree in Computer Science, Data Engineering, or related field.
Perks and Benefits
- Competitive Compensation: INR 20L to 30L per year.
- Innovative Work Environment: Work with cutting-edge AI and data engineering tools in a collaborative setting that supports continuous learning and personal growth.
- Oracle Integration Cloud (OIC) Development:
- Design, develop, and implement integration solutions using Oracle Integration Cloud (OIC).
- Build integrations between Oracle Cloud applications and on-premise or third-party applications.
- Create and maintain REST/SOAP-based web services and API calls for integration solutions.
- Implement process automation using OIC’s Process Automation capabilities.
- Monitor and manage integration flows to ensure optimal performance, security, and scalability.
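The REST/SOAP bullet above ultimately comes down to issuing well-formed HTTP calls against integration endpoints. The sketch below only constructs such a request without sending it; the hostname, token, and flow identifier are hypothetical placeholders, not a real endpoint.

```python
import json
import urllib.request


def build_integration_request(base_url, token, payload):
    """Construct (but do not send) a POST to a REST-triggered
    integration flow. The URL pattern, token, and payload fields
    are illustrative placeholders."""
    body = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(
        url=f"{base_url}/ic/api/integration/v1/flows/rest/ORDER_SYNC/1.0/orders",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )


req = build_integration_request(
    "https://example-oic.example.com", "FAKE_TOKEN",
    {"orderId": 1001, "status": "NEW"},
)
print(req.get_method(), req.get_header("Content-type"))
```

Keeping request construction separate from sending makes the payload and headers easy to unit-test before the integration ever touches a live system.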
- Platform-as-a-Service (PaaS) Expertise:
- Develop custom applications and solutions on Oracle PaaS (Java Cloud Service, Application Container Cloud, etc.).
- Deploy, manage, and support PaaS solutions for extending Oracle SaaS applications.
- Maintain cloud environments and ensure seamless integrations between PaaS solutions and Oracle applications.
- System Integration:
- Work with ERP systems like Oracle Fusion, EBS, and other third-party systems to build integrated solutions.
- Troubleshoot and resolve integration issues related to data flow, connectivity, and performance.
- Develop mappings, orchestration flows, and B2B/EDI integrations using the Oracle Integration Cloud.
- Stakeholder Collaboration:
- Collaborate with business analysts, functional teams, and IT teams to gather integration requirements and deliver effective solutions.
- Provide input into solution architecture and development strategies aligned with business needs.
- Document technical designs, configurations, and troubleshooting steps for future reference and team collaboration.
- Continuous Improvement:
- Stay updated with the latest Oracle Cloud and PaaS advancements, features, and best practices.
- Optimize integration processes to improve overall system efficiency, reduce errors, and ensure data integrity.
- Conduct performance tuning and optimize OIC flows for seamless data exchange between systems.
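A large share of the mapping work described above is translating fields between payload formats, e.g. a JSON source into an XML target. A minimal stdlib sketch of such a mapping, with hypothetical field names:

```python
import json
import xml.etree.ElementTree as ET

# Hypothetical source payload; the field names are illustrative only.
source = json.loads('{"invoiceId": "INV-42", "amount": 150.0, "currency": "USD"}')

# Map JSON fields onto a target XML structure, as an integration
# mapping would between a REST source and an XML/SOAP target.
invoice = ET.Element("Invoice")
ET.SubElement(invoice, "Id").text = source["invoiceId"]
ET.SubElement(invoice, "Amount").text = f'{source["amount"]:.2f}'
ET.SubElement(invoice, "Currency").text = source["currency"]

xml_out = ET.tostring(invoice, encoding="unicode")
print(xml_out)
```

In a real integration the same mapping would typically be expressed in the platform's mapper or in XSLT rather than hand-written code.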
Required Skills & Qualifications:
- Education:
- Bachelor's Degree in Computer Science, Information Technology, or a related field (or equivalent experience).
- Experience:
- 5-10 years of hands-on experience in Oracle Integration Cloud (OIC) development and PaaS.
- Proven experience in developing and managing complex integrations between cloud and on-premise systems.
- Strong understanding of Oracle Cloud Applications (Fusion, HCM, ERP, etc.) and their integration requirements.
- Technical Skills:
- Expertise in Oracle Integration Cloud (OIC), Oracle SOA Suite, Oracle Cloud Infrastructure (OCI), and Oracle PaaS solutions.
- Hands-on experience with RESTful/SOAP web services, API development, JSON, XML, and XSLT.
- Proficiency with Oracle ERP, Oracle Fusion Middleware, Oracle Visual Builder, and related tools.
- Strong knowledge of cloud integration patterns, process orchestration, and B2B integrations.
- Familiarity with Oracle Autonomous Databases and Oracle SQL/PLSQL.
- Soft Skills:
- Excellent problem-solving skills with attention to detail.
- Strong communication and collaboration abilities across functional and technical teams.
- Ability to work independently and manage multiple priorities in a fast-paced environment.
Preferred Qualifications:
- Oracle Cloud certifications in OIC, SOA, or PaaS.
- Knowledge of DevOps practices and CI/CD pipelines for cloud deployment.
- Experience with cloud security best practices for PaaS and OIC.
- Strong experience in Core Java, multi-threading, data structures (List/Map/Set), and unit testing with JUnit and Mockito
- Strong experience with Spring Framework (Spring MVC, Spring REST, Spring Data), ORM frameworks (JPA, Hibernate), and RDBMS (Oracle/MySQL/Postgres)
- Strong experience in applying Object-Oriented design principles and Design Patterns
- Good knowledge of multi-tier architecture, microservices architecture, and service-oriented architecture
- Exposure to AWS Cloud, NoSQL databases (MongoDB, Cassandra), message brokers (ActiveMQ/RabbitMQ/Apache Kafka), and big data technologies (Hadoop/Hive/Spark)
- Additionally, must be able to review code, produce technical specification documents, and use code quality tools (PMD/FindBugs/Sonar)
Are you ready to revolutionize the manufacturing landscape? We're on the hunt for a dynamic OT-IT Expert who’s not just skilled but passionate about transforming operational technology! Join us in optimizing our manufacturing shop floors by auditing and enhancing infrastructure and connectivity. Your mission? Identify those sneaky gaps in our data collection architecture and ensure our machines and sensors talk smoothly to the cloud. If you thrive in IIoT environments and can navigate the complexities of communication protocols like a pro, we want you on our team!
Key Responsibilities:
- Audit Like a Pro: Dive deep into our existing infrastructure and connectivity on the manufacturing shop floor.
- Gap Detective: Identify gaps in our architecture for seamless data collection from machines and sensors to cloud systems.
- Solution Architect: Design and propose the best-suited, cost-effective solution architectures based on your gap analysis.
- Proposal Wizard: Develop and prepare detailed project proposals that impress stakeholders.
- Supplier Negotiator: Identify and negotiate with suppliers for the required hardware and peripherals to bring your designs to life.
- Challenge Anticipator: Plan ahead for potential implementation challenges and devise clever solutions.
- Project Overlord: Oversee IIoT project implementations to ensure they stay on track and meet objectives.
- Data Accuracy Guardian: Ensure the accuracy of captured data by validating against actual shop floor values.
- Data Structurer: Develop and implement data format structuring and standard frameworks that enhance our operations.
- Quality Assurance Champion: Perform thorough Quality Assurance (QA) on collected data before handing it over to our data engineers.
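The QA and data-accuracy responsibilities above can be sketched as a simple range check against shop-floor limits before data is handed to the data engineers. The sensor tags, limits, and readings below are illustrative assumptions, not real plant values.

```python
# Hypothetical plausibility limits per sensor tag; real limits would
# come from the machine and shop-floor specifications.
LIMITS = {"spindle_temp_c": (10.0, 90.0), "line_speed_mpm": (0.0, 120.0)}


def qa_check(readings):
    """Return the tags whose captured values fall outside their
    expected range, flagging them before downstream handover."""
    failures = []
    for tag, value in readings.items():
        low, high = LIMITS[tag]
        if not low <= value <= high:
            failures.append(tag)
    return failures


batch = {"spindle_temp_c": 142.7, "line_speed_mpm": 60.0}  # illustrative values
print(qa_check(batch))  # the over-range temperature is flagged
```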
Required Skills and Experience:
- Extensive experience in IIoT projects involving field communication protocols (Modbus, OPC, PROFINET, PROFIBUS).
- Strong knowledge of PLC programming with hands-on experience in multiple PLC brands (Siemens, Allen-Bradley, Mitsubishi).
- Proficient in auditing and analyzing OT infrastructure like a seasoned expert.
- Expertise in cloud connectivity and data collection systems.
- Creative thinker with the ability to design scalable and cost-effective IIoT architectures.
- Strong negotiation skills to manage vendor relationships effectively.
- Exceptional problem-solving and forward-thinking abilities.
- Solid understanding of manufacturing processes and shop floor operations.
- Proven experience in data validation, structuring, and quality assurance.
- Familiarity with data engineering concepts and the ability to liaise with data engineering teams.
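Field protocols such as Modbus expose values as 16-bit registers, so a 32-bit float arrives as two registers that must be reassembled. A minimal stdlib sketch; big-endian word order is an assumption here, as real devices differ in register ordering.

```python
import struct


def registers_to_float(hi, lo):
    """Combine two 16-bit Modbus holding registers into an IEEE-754
    float, assuming big-endian word order (device-dependent)."""
    return struct.unpack(">f", struct.pack(">HH", hi, lo))[0]


# 0x41C8, 0x0000 is the big-endian register pair encoding 25.0
print(registers_to_float(0x41C8, 0x0000))  # → 25.0
```

Getting the word order wrong produces plausible-looking garbage, which is exactly the kind of gap a shop-floor data audit should catch.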
Preferred Qualifications:
- Bachelor's degree in Instrumentation, Information Technology, Electrical Engineering, or a related field.
- Relevant certifications in Industrial Automation or IIoT technologies.
- 5+ years of experience in OT/IT convergence projects within manufacturing environments.
- Experience with data standardization and framework development in industrial settings.
Key Attributes:
- Strategic thinker with a hands-on approach who can get things done.
- Excellent communication skills to connect with both technical and non-technical stakeholders.
- Proactive problem-solver with a history of successful project deliveries.
- Adaptable to rapidly evolving technology landscapes, ready for anything!
- Meticulous attention to detail, especially in data accuracy and quality.
- Strong analytical skills for data validation and structure optimization.
Your key responsibilities
- Create and maintain optimal data pipeline architecture, drawing on experience building batch and real-time ETL pipelines. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources
- The individual will be responsible for solution design, integration, data sourcing, transformation, database design and implementation of complex data warehousing solutions.
- Responsible for development, support, maintenance, and implementation of a complex project module
- Provide subject-matter expertise and advanced knowledge of applications programming, and ensure application design adheres to the overall architecture blueprint
- Utilize advanced knowledge of system flow and develop standards for coding, testing, debugging, and implementation
- Resolve a variety of high-impact problems/projects through in-depth evaluation of complex business processes, system processes, and industry standards
- Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support and delivering complete reporting solutions.
- Prepare the high-level design (HLD) describing the overall application architecture.
- Prepare the low-level design (LLD) covering job design, job descriptions, and detailed job-level information.
- Prepare and execute unit test cases.
- Provide technical guidance and mentoring to application development teams throughout all the phases of the software development life cycle
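The extract-transform-load responsibilities above can be sketched end to end with the standard library; here sqlite3 stands in for the target warehouse, and the table and records are hypothetical.

```python
import sqlite3

# sqlite3 stands in for the target data warehouse in this sketch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")

# Extract: raw records from a hypothetical source feed.
raw = [("north ", "100.5"), ("SOUTH", "200.0"), ("north", "50.0")]

# Transform: normalize keys and cast types before loading.
clean = [(region.strip().lower(), float(amount)) for region, amount in raw]

# Load: batch insert, then aggregate for a downstream report.
conn.executemany("INSERT INTO sales VALUES (?, ?)", clean)
totals = dict(
    conn.execute("SELECT region, SUM(amount) FROM sales GROUP BY region")
)
print(totals)  # → {'north': 150.5, 'south': 200.0}
```

A production pipeline adds scheduling, incremental loads, and failure handling around this same extract/transform/load skeleton.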
Skills and attributes for success
- Strong experience in SQL: proficient in writing performant queries against large data volumes and in writing and debugging complex SQL.
- Strong experience with Microsoft Azure data services, including Azure Data Factory.
- Strong in Data Warehousing concepts. Experience with large-scale data warehousing architecture and data modelling.
- Sufficient experience with PowerShell scripting
- Able to guide the team through the development, testing and implementation stages and review the completed work effectively
- Able to make quick decisions and solve technical problems to provide an efficient environment for project implementation
- Primary owner of delivery and timelines; review code written by other engineers.
- Maintain highest levels of development practices including technical design, solution development, systems configuration, test documentation/execution, issue identification and resolution, writing clean, modular and self-sustaining code, with repeatable quality and predictability
- Must have an understanding of business intelligence development in the IT industry
- Outstanding written and verbal communication skills
- Should be adept in SDLC process - requirement analysis, time estimation, design, development, testing and maintenance
- Hands-on experience in installing, configuring, operating, and monitoring CI/CD pipeline tools
- Able to orchestrate and automate pipelines
- Good to have: knowledge of distributed systems such as Hadoop, Hive, Spark
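"Performant SQL over large volumes" often means deduplicating to the latest record per key. Sketched here against sqlite3 with a hypothetical orders table, a window function does this in one pass, avoiding the correlated-subquery or self-join alternative.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, status TEXT, updated_at TEXT)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [
        (1, "NEW", "2024-01-01"),
        (1, "SHIPPED", "2024-01-03"),  # latest row for order 1
        (2, "NEW", "2024-01-02"),
    ],
)

# Keep only the most recent row per order_id using ROW_NUMBER().
latest = conn.execute(
    """
    SELECT order_id, status FROM (
        SELECT order_id, status,
               ROW_NUMBER() OVER (
                   PARTITION BY order_id ORDER BY updated_at DESC
               ) AS rn
        FROM orders
    ) WHERE rn = 1
    ORDER BY order_id
    """
).fetchall()
print(latest)  # → [(1, 'SHIPPED'), (2, 'NEW')]
```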
To qualify for the role, you must have
- Bachelor's Degree in Computer Science, Economics, Engineering, IT, Mathematics, or related field preferred
- More than 6 years of experience in ETL development projects
- Proven experience in delivering effective technical ETL strategies
- Microsoft Azure project experience
- Technologies: ETL- ADF, SQL, Azure components (must-have), Python (nice to have)
Designation: Kafka Consultant
Job Description:
- 3-8+ years of experience programming in a backend language (Java / Python), with a good understanding of troubleshooting errors.
- 5+ years of experience with Kafka, including 3+ years with Confluent Kafka
- Cloud Kafka, Control Center, REST Proxy, HAProxy, Confluent Kafka Connect, Confluent Kafka security features
- 3-5+ years of working experience with databases such as Oracle, MySQL, MongoDB, Snowflake, and RDS.
- 3-5+ years of experience with AWS/GCP/Azure (AWS preferred); AWS Solutions Architect certification is preferred.
- 2+ years of experience with monitoring tools (Prometheus, Grafana, Splunk)
- Proficient in building and scaling large systems end to end using TIBCO BW5/BW6 or other high-performance stacks, plus developer tools, CI/CD, DevOps, GitHub, Terraform, and Ansible.
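Much of the Kafka work above depends on key-based partitioning: all events for one key land on the same partition and therefore stay ordered. The stdlib sketch below illustrates the idea only; Kafka's actual default partitioner uses a murmur2 hash, not MD5.

```python
import hashlib


def partition_for(key: str, num_partitions: int) -> int:
    """Map a message key to a partition deterministically, so all
    events for one key land on the same partition and keep their
    order. Illustrative only: Kafka's default partitioner uses
    murmur2, not this MD5-based mapping."""
    digest = hashlib.md5(key.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions


# All events for one (hypothetical) order id map to one partition.
p1 = partition_for("order-1001", 6)
p2 = partition_for("order-1001", 6)
print(p1 == p2)  # → True
```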
Job Description: Full Stack Developer (Angular/API)
- Experience in developing web and mobile applications using Frontend & Backend framework
- Hands-on Experience in developing frontend UI using Angular.
- Hands-on experience in API development, preferably using .NET Core.
- Hands-on experience with ORM solutions like Entity Framework, Dapper, or Hibernate
- Strong knowledge of front-end UI frameworks like Bootstrap, Material, and PrimeNG
- Strong knowledge of multiple front-end languages and libraries (e.g. HTML/CSS, JavaScript, XML, jQuery)
- Knowledge of cloud technology like Azure/AWS
- Must have strong OOP knowledge
- Must have knowledge of design principles
- Strong knowledge of writing DB queries, stored procedures, and views, with indexing and query-optimization skills
- Must know SDLC/Agile software development
- Must know a source code management tool: TFS, SVN, or Git
- Willingness to participate in organization initiatives and to create reusable libraries and POCs
- Inclination towards cutting-edge technology and the latest frameworks
- Strong troubleshooting skills and an analytical mind
- Excellent communication skills and a team player.
Must-have requirements: the candidate should preferably have their own mode of transport; travel expenses will be reimbursed.
- Preference: Local candidates/ candidates who know the city well in terms of areas and student hubs would be preferred.
- Experience: 0-2 years
- Job Type: Field Job
- Own vehicle required
- Excellent communication and negotiation skills
- Salary: 2LPA – 3LPA
- No. Of Openings: 5
Job Description:
- Conduct research in a specified area and collect information as per the guidance of City Head
- Assisting the city heads in pitching to the vendors (brands) and bringing them onboard. This process includes registration, briefing and providing contracts as per discussed norms and policies of the company.
- Working in tandem with the research team to understand potential vendors well.
- Compile an on-ground report from the accumulated data and submit it to the board so that further strategies can be implemented.
- Building a network on the basis of your location so that the company can form proper channels and develop connectivity within the city.
- Submitting critical data collected about vendors to the team and ensuring all privacy policies are followed.
- Being creative in your sales pitch, contributing ideas and helping the team grow.
- Build long-term trusting relationships with clients
- Proactively seek new business opportunities in the market.