11+ IDoc Jobs in Hyderabad | IDoc Job openings in Hyderabad
Apply to 11+ IDoc Jobs in Hyderabad on CutShort.io. Explore the latest IDoc Job opportunities across top companies like Google, Amazon & Adobe.
- Hands-on experience with FICO
- Good understanding of finance-related business processes and period-end closing activities
- Experience with sub-modules such as AP, AR, GL, Asset Accounting, Tax, and Bank Accounting
- Technical aspects such as IDocs, user exits, BAdIs, and workflow
- Integration with SD and MM
- Basic knowledge of Controlling - COPA, PCA, CCA
- Good communication skills
- Interpersonal skills
- Analyze, design, configure, test, and implement SAP solutions to meet business requirements
We’re Hiring: Estimator – Exterior Windows, Doors & Storefronts 🏗️
📍 Location: Hyderabad, India
💼 Type: Full-Time | Reports to Estimating Manager
✨ Key Responsibilities:
🔹 Read & analyze architectural drawings 📐
🔹 Perform quantity takeoffs & prepare cost estimates 💰
🔹 Use AutoCAD, Excel, OST & Bluebeam for estimates 💻
🔹 Coordinate with vendors & subcontractors 🤝
🔹 Support project managers during preconstruction 🏢
🔹 Maintain organized estimate data 📁
🎓 Qualifications:
✅ Bachelor’s in Construction / Engineering / Architecture
✅ 2–3 yrs experience in glazing, façade, or exterior systems
✅ Proficient in AutoCAD & Excel
✅ Knowledge of OST, Bluebeam & Revit (plus)
⭐ Preferred:
🏢 Experience with curtain wall, storefront, window wall & doors
🧠 Strong analytical, math & organizational skills
⚡ Ability to multitask & meet deadlines
Job Description
We are seeking a highly skilled DevOps / Kubernetes Engineer. The ideal candidate will have strong expertise in container orchestration, infrastructure as code, and GitOps workflows, with hands-on experience in Azure cloud environments. You will be responsible for designing, deploying, and managing modern cloud-native infrastructure and applications at scale.
Key Responsibilities:
· Manage and operate Kubernetes clusters (AKS / K3s) for large-scale applications.
· Implement infrastructure as code using Terraform or OpenTofu for scalable, reliable, and secure infrastructure provisioning.
· Deploy and manage applications using Helm and ArgoCD with GitOps best practices.
· Work with Podman and Docker as container runtimes for development and production environments.
· Collaborate with cross-functional teams to ensure smooth deployment pipelines and CI/CD integrations.
· Optimize infrastructure for cost, performance, and reliability within Azure cloud.
· Troubleshoot, monitor, and maintain system health, scalability, and performance.
Required Skills & Experience:
· Strong hands-on experience with Kubernetes (AKS / K3s) cluster orchestration.
· Proficiency in Terraform or OpenTofu for infrastructure as code.
· Experience with Helm and ArgoCD for application deployment and GitOps.
· Solid understanding of Docker / Podman container runtimes.
· Cloud expertise in Azure with experience deploying and scaling workloads.
· Familiarity with CI/CD pipelines, monitoring, and logging frameworks.
· Knowledge of best practices around cloud security, scalability, and high availability.
Preferred Qualifications:
· Contributions to open-source projects under Apache 2.0 / MPL 2.0 licenses.
· Experience working in global distributed teams across CST/PST time zones.
· Strong problem-solving skills and ability to work independently in a fast-paced environment.
· Familiarity with Agile development methodologies.
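The GitOps tooling named above (Helm, ArgoCD, Terraform/OpenTofu) is built on declarative manifests kept in Git. As a minimal sketch of that idea, the Python below assembles a Kubernetes Deployment document as a plain dict; the app name, image, and port are illustrative placeholders, not values from the posting.

```python
def deployment_manifest(name: str, image: str, replicas: int = 2) -> dict:
    """Build a minimal Kubernetes Deployment manifest as a plain dict.

    In a GitOps workflow (e.g. Helm + ArgoCD) this document would live in
    Git as YAML, and the cluster would be reconciled to match it.
    """
    labels = {"app": name}
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name, "labels": labels},
        "spec": {
            "replicas": replicas,
            "selector": {"matchLabels": labels},
            "template": {
                "metadata": {"labels": labels},
                "spec": {
                    "containers": [
                        # Placeholder container; any image/port would do here.
                        {"name": name, "image": image,
                         "ports": [{"containerPort": 8080}]},
                    ]
                },
            },
        },
    }

# "web" and the nginx image tag are made-up examples, not from the posting.
manifest = deployment_manifest("web", "nginx:1.27", replicas=3)
print(manifest["metadata"]["name"], manifest["spec"]["replicas"])  # web 3
```

The same structure, serialized to YAML and committed to a repository, is what tools like ArgoCD continuously apply to an AKS or K3s cluster.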
• Areas of expertise: Go, Electron, NodeJs.
• Experience developing in Linux and Windows environments
• Proficient with software development tools such as IDEs, debuggers, profilers, and source control systems.
• Strong in coding languages (e.g. C, C++, Go, Java) and frameworks/tools (e.g. NodeJs, Electron, Git)
Preferred Qualifications
The ideal candidate will possess the following experience:
• Good subject matter expertise with Kubernetes, Docker, and other container orchestration tools.
• Practical experience developing, testing and operating a service in a hybrid cloud.
• Outstanding coding/scripting skills, preferably in Go.
• Experience with Linux/Unix
• Experience in Jenkins and CI/CD environment
• Strong oral and written communication skills
Big Data Engineer- SCALA
Required Skills:
1. Experience in building modern and scalable REST-based microservices using Scala, preferably with Play as the MVC framework.
2. Expertise in functional programming with Scala.
3. Experience implementing RESTful web services in Scala, Java, or similar languages.
4. Experience with NoSQL and SQL databases.
5. Experience in information retrieval and machine learning.
6. Experience or knowledge of big data using Spark with Scala, ML, Kafka, and Elasticsearch is a plus.
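The posting centers on Scala/Play, but the request/response shape of such a REST microservice can be sketched with nothing beyond a standard library; the Python below is only an illustration of the pattern, and the `/items` resource and its payload are made up for the example.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# In-memory stand-in for a database; the item data is invented.
ITEMS = {1: {"id": 1, "name": "widget"}}

class ItemHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Route: GET /items/<id> -> JSON body, the typical REST resource shape.
        try:
            item = ITEMS[int(self.path.rsplit("/", 1)[-1])]
        except (ValueError, KeyError):
            self.send_response(404)
            self.end_headers()
            return
        body = json.dumps(item).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep test output quiet

# Bind to an ephemeral port and serve from a background thread.
server = HTTPServer(("127.0.0.1", 0), ItemHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/items/1"
with urllib.request.urlopen(url) as resp:
    payload = json.load(resp)
print(payload)  # {'id': 1, 'name': 'widget'}
server.shutdown()
```

A Play controller routing `GET /items/:id` to a JSON `Action` follows the same contract; only the framework machinery differs.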
We are looking for a motivated full-stack engineer who is proficient in both front-end and back-end technologies. Responsibilities include designing, building, and maintaining web applications, databases, and APIs that enable businesses to operate efficiently.
Responsibilities
- Develop, test, and deploy fast, scalable web apps.
- Design and maintain fully functional, large relational and non-relational databases.
- Deploy web apps to the cloud on time.
- Manage servers and cloud-based infrastructure.
- Set up and integrate development tools as required.
- Conduct code reviews of peer developers.
Requirements
- 2 years in MEAN Stack Development.
- Expertise in technologies - MongoDB, ExpressJS, and NodeJS.
- High-quality programming skills for a robust design.
- Experience in application architecture, server management, cross-browser compatibility, responsive design, and website performance.
- Understanding of DB architecture design and programming templates, Agile methodologies, and client-side and server-side procedures.
- BE/ B. Tech/ MCA.
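The non-relational side of the MEAN stack above can be illustrated with a toy version of a MongoDB-style `find()` filter over schema-less documents; the collection, field names, and query values here are invented for the example.

```python
# A MongoDB-style collection is essentially a list of schema-less documents.
# find() below mimics a tiny subset of Mongo's filter semantics:
# exact-match on top-level fields. All data is made up for illustration.

users = [
    {"_id": 1, "name": "asha", "city": "Hyderabad", "active": True},
    {"_id": 2, "name": "ravi", "city": "Pune", "active": False},
    {"_id": 3, "name": "meena", "city": "Hyderabad", "active": True},
]

def find(collection, query):
    """Return documents whose fields all equal the values in `query`."""
    return [doc for doc in collection
            if all(doc.get(k) == v for k, v in query.items())]

active_in_hyd = find(users, {"city": "Hyderabad", "active": True})
print([doc["name"] for doc in active_in_hyd])  # ['asha', 'meena']
```

The real driver call, `db.users.find({"city": "Hyderabad", "active": True})`, takes the same filter document; Express route handlers typically wrap exactly this kind of query.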

at Altimetrik
• Strong Restful API, Micro-services development experience using ASP.NET CORE Web APIs (C#);
• Must have exceptionally good software design and programming skills in .Net Core (.NET 3.X, .NET 6) Platform, C#, ASP.net MVC, ASP.net Web API (RESTful), Entity Framework & LINQ
• Good working knowledge on Azure Functions, Docker, and containers
• Expertise in Microsoft Azure Platform - Azure Functions, Application Gateway, API Management, Redis Cache, App Services, Azure Kubernetes, CosmosDB, Azure Search, Azure Service Bus, Function Apps, Azure Storage Accounts, Azure KeyVault, Azure Log Analytics, Azure Active Directory, Application Insights, Azure SQL Database, Azure IoT, Azure Event Hubs, Azure Data Factory, Virtual Networks and networking.
• Strong SQL Server expertise and familiarity with Azure Cosmos DB, Azure Storage (Blob, Table, Queue), Azure SQL, etc.
• Experienced in Test-Driven Development, unit testing libraries, testing frameworks.
• Good knowledge of Object Oriented programming, including Design Patterns
• Cloud Architecture - Technical knowledge and implementation experience using common cloud architecture, enabling components, and deployment platforms.
• Excellent written and oral communication skills, along with the proven ability to work as a team with other disciplines outside of engineering are a must
• Solid analytical, problem-solving and troubleshooting skills
Desirable Skills:
• Certified Azure Solution Architect Expert
o Microsoft Certified: Azure Fundamentals (AZ-900)
o Microsoft Certified: Azure Administrator Associate (AZ-104)
o Microsoft Certified: Azure Developer Associate (AZ-204)
o Microsoft Certified: DevOps Engineer Expert (AZ-400)
o Microsoft Certified: Azure Solutions Architect Expert (AZ-305)
• Good understanding of software architecture, scalability, resilience, performance;
• Working knowledge of automation tools such as Azure DevOps, Azure Pipeline or Jenkins or similar
Roles & Responsibilities
• Defining best practices & standards for usage of libraries, frameworks and other tools being used;
• Architecture, design, and implementation of software across development, delivery, and release.
• Break down complex requirements into independent architectural components, modules, tasks, and strategies, and collaborate with peer leadership through the full software development lifecycle to deliver top quality, on time and within budget.
• Demonstrate excellent communications with stakeholders regarding delivery goals, objectives, deliverables, plans and status throughout the software development lifecycle.
• Should be able to work with various stakeholders (Architects, Product Owners, Leadership) as well as the team, as a Lead, Principal, or Individual Contributor for Web UI/Front-End Development;
• Should be able to work in an agile, dynamic team environment;
· Oracle CPQ Configuration/BML Development knowledge to meet documented business process requirements.
· Develop and support Configuration, Commerce, integrations, reports, workflow, BML and custom development in a CPQ environment.
· Working knowledge on complex product hierarchy and pricing models
· Expertise in BOM, Integrations
· Good knowledge of REST APIs, CSS, JavaScript (CPQ JS APIs), and XSL.
· Experience handling CPQ migrations and release upgrades.
· Good understanding of CPQ best practices and the ability to address performance-related issues.
· Should be able to troubleshoot and resolve system issues.
· Play a critical role in testing, evaluating, and delivering a high-quality end product.
· Sound knowledge of the Quote-to-Cash process.
Skills – Jboss, DevOps, ServiceNow, Windows Server.
JD - Application Maintenance
Must have:
· Installation and configuration of custom/standard software, e.g., FileZilla, JDK, OpenJDK
· Installation and configuration of JBoss/Tomcat server; configuration of HTTPS certificates in JBoss/Tomcat
· Windows Event Viewer, IIS logs, Windows Security, Active Directory
· Setting environment variables, registry values, etc.
Nice to have:
· Basics of monitoring
· Knowledge of PowerShell, MS Azure DevOps
· Deploying and configuring applications; checking the last installed version of any software/patch
· ServiceNow, ITIL, Incident Management, Change Management
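One of the must-haves above is knowing how to set and read environment variables. A minimal Python sketch of the pattern, with a fail-fast check for missing configuration; the variable name and value are placeholders, not from the JD:

```python
import os

# Set an environment variable programmatically; "APP_HOME" and its value
# are illustrative placeholders only.
os.environ["APP_HOME"] = "/opt/myapp"

def require_env(name: str) -> str:
    """Return a required environment variable, failing fast with a clear
    message if it is not set (a common guard in application startup)."""
    value = os.environ.get(name)
    if value is None:
        raise RuntimeError(f"required environment variable {name!r} is not set")
    return value

print(require_env("APP_HOME"))  # /opt/myapp
```

On Windows the same values are typically managed via System Properties or `setx`; the fail-fast check is equally useful there.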
Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.
Job Description
Experience : 7+ years
Location : Pune / Hyderabad
Skills :
- Drive and participate in requirements gathering workshops, estimation discussions, design meetings and status review meetings
- Participate and contribute in Solution Design and Solution Architecture for implementing Big Data Projects on-premise and on cloud
- Hands-on technical experience in the design, coding, development, and management of large Hadoop implementations
- Proficient in SQL, Hive, Pig, Spark SQL, Shell Scripting, Kafka, Flume, and Sqoop on large Big Data and Data Warehousing projects, with a Java-, Python-, or Scala-based Hadoop programming background
- Proficient with various development methodologies such as waterfall, agile/scrum, and iterative
- Good interpersonal skills and excellent communication skills for US- and UK-based clients
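The SQL/Hive/Spark SQL proficiency listed above rests on core SQL semantics shared by virtually every engine. The sketch below runs a typical GROUP BY aggregation against sqlite3 purely so it is self-contained and runnable anywhere; the sales table and its values are invented.

```python
import sqlite3

# Hive and Spark SQL share core SQL semantics with any relational engine;
# sqlite3 stands in here only because it needs no cluster.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("south", 100), ("south", 250), ("north", 75)],
)

# A GROUP BY aggregation of the kind used constantly in Hive/Spark SQL.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('north', 75), ('south', 350)]
```

The identical statement runs unchanged in Hive or Spark SQL against a partitioned table; only scale and execution engine differ.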
About Us!
A global leader in Data Warehouse Migration and Modernization to the Cloud, we empower businesses by migrating their Data/Workload/ETL/Analytics to the Cloud, leveraging Automation.
We have expertise in transforming legacy platforms such as Teradata, Oracle, Hadoop, Netezza, Vertica, and Greenplum, along with ETLs like Informatica, Datastage, AbInitio, and others, to cloud-based data warehousing, with further capabilities in data engineering, advanced analytics solutions, data management, data lakes, and cloud optimization.
Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.
We have our own products!
Eagle – Data warehouse Assessment & Migration Planning Product
Raven – Automated Workload Conversion Product
Pelican - Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.
Why join us!
Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over the years are the key factors in our success.
Benefits we Provide!
Working with highly technical, passionate, mission-driven people
Subsidized Meals & Snacks
Flexible Schedule
Approachable leadership
Access to various learning tools and programs
Pet Friendly
Certification Reimbursement Policy
Check out more about us on our website below!
www.datametica.com
Technical Document writer
About Aviso
Aviso is the AI Compass that guides Sales and Go-to-Market teams to close more deals, accelerate revenue growth, and find their True North. Aviso delivers true revenue intelligence, nudges team-wide actions, and gives precise guidance so sellers and teams don’t get lost in the fog of CRM, scattered data lakes, and human biases.
We are a global company with offices in Redwood City, San Francisco, Hyderabad, and Bangalore. Our customers are innovative leaders in their market. We are proud to count Dell, Honeywell, MongoDB, Glassdoor, Splunk, FireEye, and RingCentral as our customers, helping them drive revenue, achieve goals faster, and win in bold new frontiers. Aviso is backed by Storm Ventures, Shasta Ventures, Scale Venture Partners and leading Silicon Valley technology investors.
What you will be doing
- The documentation effort includes evaluation, document preparation (organize, write, capture screen images, index), editing and review with the project team.
- Ability to grasp and visualize enterprise software and infrastructure concepts.
- Ability to produce world class documentation.
- Work closely with the product engineering team and gain deep understanding of the product features, behaviour and requirements.
- Writing of any and all User Manuals for Aviso
- Work with the QA team to produce high quality QA documents.
- Update and review existing documents for every release after analyzing the changes. Own and manage this documentation in a central repository with a proper folder structure.
- Organize the technical material into various categories of documents.
- Write world class technical content that is clear and consistent with appropriate diagrams, charts and screen shots of the product for better illustration as per established document standards.
- Attend cross-functional planning meetings and scrum meetings
- Coordinate with subject matter experts to ensure that the content is accurate and current
- Provide daily/weekly activity status to the project heads and communicate with them via email, IM and video conference.
- Should be able to present information in a simple, concise manner. Should pay attention to detail.
What you bring
- Qualification: BE/BTech, ME/MTech, or MCA background
- Minimum of 4 years of experience in Technical Writing
- Excellent written and oral communication skills
- Ability to grasp and communicate technical concepts
- Strong organizational and analytical skills
- Experience in modular documentation using XML editors
- Understanding of the latest technologies in technical communication is an added advantage
- Demonstrated ability to analyze and communicate information needs at both the conceptual and detail level
- Experience working in an Agile environment would be an advantage
Aviso offers
- Dynamic, diverse, inclusive startup environment driven by transparency and velocity
- Bright, open, sunny working environment and collaborative office space
- Convenient office locations in Redwood City, Hyderabad and Bangalore tech hubs
- Competitive salaries and company equity, and a focus on developing world class talent operations
- Comprehensive health insurance available (medical) for you and your family
- Unlimited leaves with manager approval and a 3 month paid sabbatical after 3 years of service
- CEO moonshots projects with cash awards every quarter
- Upskilling and learning support including via paid conferences, online courses, and certifications
- Rupees 2,500 credited to your Sodexo meal card every month