Job Overview: We are seeking a dedicated Senior Statistical Programmer to join our dynamic team. You will be responsible for the development, quality control, and documentation of statistical (SAS) programming deliverables for clinical research studies.
Key Responsibilities:
1. Lead and oversee the development of SAS programs for the management and statistical analysis of clinical trial data.
2. Develop, test, and validate statistical tables, listings, and graphs (TLGs) in support of the statistical analysis plan.
3. Support the generation and review of protocols, data management plans, study reports, and other regulatory documents.
4. Provide input to the statistical analysis plan, table shells, data integration plans, and mock-ups.
5. Ensure data quality by designing and validating key data checks and listings.
6. Develop specifications for derived datasets and perform data transformation as necessary.
7. Collaborate effectively with data management and biostatistics teams, clinicians, and other project stakeholders.
8. Guide junior programmers and provide mentoring as needed.
9. Keep abreast of new developments in SAS programming and clinical research, and act as a subject matter expert for the team.
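Responsibility 5 above (designing and validating key data checks and listings) can be sketched in a general-purpose language. A minimal Python illustration of rule-based checks over trial records — the field names, subject IDs, and valid ranges are hypothetical assumptions, not from any actual study, and a real deliverable would be implemented in SAS per the role:

```python
# Illustrative sketch of rule-based data checks on clinical trial records.
# Field names and valid ranges below are hypothetical assumptions.

def check_records(records, rules):
    """Return a listing of (subject_id, field, value, message) for rule violations."""
    findings = []
    for rec in records:
        for field, (lo, hi) in rules.items():
            value = rec.get(field)
            if value is None:
                findings.append((rec["subject_id"], field, value, "missing value"))
            elif not (lo <= value <= hi):
                findings.append((rec["subject_id"], field, value,
                                 f"out of range [{lo}, {hi}]"))
    return findings

rules = {"age": (18, 90), "weight_kg": (30, 250)}
records = [
    {"subject_id": "S001", "age": 54, "weight_kg": 82},
    {"subject_id": "S002", "age": 17, "weight_kg": 65},    # age below range
    {"subject_id": "S003", "age": 61, "weight_kg": None},  # missing weight
]
findings = check_records(records, rules)
```

The output listing mirrors the kind of edit-check report a programmer would hand to data management for query resolution.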
Required Qualifications and Skills:
1. Bachelor's or Master's degree in Statistics, Mathematics, Computer Science, or a related field.
2. Minimum of 3 years of statistical programming experience in the pharmaceutical, biotech, or CRO industry.
3. Excellent SAS programming skills and proficiency in SAS/Base, SAS/Macro, SAS/Graph, and SAS/STAT.
4. Familiarity with CDISC SDTM and ADaM data standards.
5. In-depth understanding of clinical trial data and complex statistical methods.
6. Excellent problem-solving skills and a proactive approach to identifying and resolving issues.
7. Strong written and verbal communication skills, with the ability to translate complex data into understandable results.
8. Fluency in written and spoken English.
9. Prior experience leading teams or projects is highly desirable.
Preferred Skills:
1. Experience in oncology, immunology, respiratory, infectious diseases, or neurosciences is a plus.
2. Knowledge of other statistical software such as R or Python is a plus.
3. Knowledge of regulatory guidelines (FDA/EMA/ICH) is preferred.
Review Criteria:
- Strong Dremio / Lakehouse Data Architect profile
- 5+ years of experience in Data Architecture / Data Engineering, including at least 3 years of hands-on Dremio experience
- Strong expertise in SQL optimization, data modeling, query performance tuning, and designing analytical schemas for large-scale systems
- Deep experience with cloud object storage (S3 / ADLS / GCS) and file formats such as Parquet, Delta, Iceberg along with distributed query planning concepts
- Hands-on experience integrating data via APIs, JDBC, Delta/Parquet, object storage, and coordinating with data engineering pipelines (Airflow, DBT, Kafka, Spark, etc.)
- Proven experience designing and implementing lakehouse architecture including ingestion, curation, semantic modeling, reflections/caching optimization, and enabling governed analytics
- Strong understanding of data governance, lineage, RBAC-based access control, and enterprise security best practices
- Excellent communication skills with ability to work closely with BI, data science, and engineering teams; strong documentation discipline
- Candidates must come from enterprise data modernization, cloud-native, or analytics-driven companies
Preferred:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) or data catalogs (Collibra, Alation, Purview); familiarity with Snowflake, Databricks, or BigQuery environments
Role & Responsibilities:
You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.
- Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
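The governance responsibility above (security, lineage, and access control) centers on RBAC. A minimal sketch of the idea in Python, assuming hypothetical role, dataset, and permission names — in practice this would be configured through Dremio's privilege model or an external catalog, not hand-rolled:

```python
# Minimal sketch of role-based access control for governed data products.
# Roles, dataset names, and grants below are hypothetical.

ROLE_GRANTS = {
    "bi_analyst": {("sales_semantic", "read")},
    "data_engineer": {("sales_raw", "read"), ("sales_raw", "write"),
                      ("sales_semantic", "read"), ("sales_semantic", "write")},
}

def is_allowed(roles, dataset, action):
    """A user is allowed if any of their roles grants (dataset, action)."""
    return any((dataset, action) in ROLE_GRANTS.get(role, set()) for role in roles)

print(is_allowed(["bi_analyst"], "sales_semantic", "read"))  # True
print(is_allowed(["bi_analyst"], "sales_raw", "read"))       # False
```

The key design point is that analysts see only the curated semantic layer while engineers can also touch the raw zone — the same separation the ingestion/curation bullets describe.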
Ideal Candidate:
- Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
- 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
- Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
Preferred:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
- Exposure to Snowflake, Databricks, or BigQuery environments.
- Experience in high-tech, manufacturing, or enterprise data modernization programs.
About the company
KPMG International Limited, commonly known as KPMG, is one of the largest professional services networks in the world, recognized as one of the "Big Four" accounting firms alongside Deloitte, PricewaterhouseCoopers (PwC), and Ernst & Young (EY). KPMG provides a comprehensive range of professional services primarily focused on three core areas: Audit and Assurance, Tax Services, and Advisory Services. Their Audit and Assurance services include financial statement audits, regulatory audits, and other assurance services. The Tax Services cover various aspects such as corporate tax, indirect tax, international tax, and transfer pricing. Meanwhile, their Advisory Services encompass management consulting, risk consulting, deal advisory, and other related services.
Form link for a quicker response: https://forms.gle/vEYAFUaj8Fgs3sLy6
Job Description
Education Qualification: CA, CFA
Position: Senior & Junior positions
Experience: Senior (6-8 years) & Junior (2-4 years)
Location: Pan India, with potential requirement to travel to the Middle East
Employment Type: Contract for 6-12 months (Hybrid)
Responsibilities:
- Record Keeping: Maintain accurate and up-to-date financial records, including transactions, accounts payable/receivable, and payroll.
- Budget Monitoring: Track spending against the approved budget and flag discrepancies.
- Reporting Support: Assist the CFO in preparing financial reports and forecasts.
- Invoice Management: Process payments and manage vendor relations.
- Data Entry: Ensure timely and accurate entry of financial data into accounting systems.
- Tax Compliance: Ensure compliance with local and international tax regulations, including filing returns and payments.
- Tax Planning: Develop tax-efficient strategies to optimize the firm’s financial performance.
- Reporting: Prepare tax reports and liaise with external auditors and tax authorities.
- Regulatory Updates: Stay updated on changes in tax laws and assess their impact on the firm’s operations.
- Risk Mitigation: Identify and mitigate potential tax risks to avoid penalties or disputes with authorities.
About the Role
We are seeking a fast, creative, and technically skilled 3D Explainer Video Producer to transform daily incident briefs into short, high-impact explainers within hours. You will work across a blend of real-time 3D pipelines and AI-powered tools to deliver cinematic-quality videos that help the world understand fast-moving events — all while staying aligned to our visual brand.
Key Responsibilities
- Translate incident briefs into concise visual storylines and scene concepts
- Create or AI-generate 3D assets using Blender, Maya, or 3ds Max
- Animate scenes using real-time engines like Unreal Engine or Unity
- Composite titles, graphics, and voiceovers using After Effects and Premiere (or Resolve)
- Integrate AI tools like Runway, Pika, and ElevenLabs to accelerate production
- Maintain a reusable library of templates, transitions, rigs, and sound design elements
- Ensure brand consistency across color, fonts, resolution, aspect ratios, and platform-specific specs
- Collaborate with writers, analysts, and social media teams to refine edits under tight timelines
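The brand-consistency responsibility above (resolution, aspect ratios, platform-specific specs) is the kind of check that is easy to automate. A small Python sketch — the specs table is a hypothetical example, not an actual brand guideline:

```python
# Sketch of a platform-spec check for rendered videos.
# The specs table below is a hypothetical example, not a real guideline.

PLATFORM_SPECS = {
    "youtube_shorts": {"aspect": (9, 16), "max_seconds": 60},
    "linkedin":       {"aspect": (16, 9), "max_seconds": 600},
}

def meets_spec(platform, width, height, seconds):
    """Check aspect ratio (cross-multiplied to avoid float division) and duration."""
    spec = PLATFORM_SPECS[platform]
    aw, ah = spec["aspect"]
    return width * ah == height * aw and seconds <= spec["max_seconds"]

print(meets_spec("youtube_shorts", 1080, 1920, 45))  # True
print(meets_spec("linkedin", 1080, 1920, 45))        # False (wrong aspect)
```

Checks like this slot naturally into the bonus "scripting for pipeline automation" skill listed below.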
Required Skills
1+ years of 3D/motion-graphics production experience
Proficiency in Blender, Maya, or 3ds Max
Experience with Unreal Engine or Unity for cinematic rendering
Advanced use of After Effects and Premiere Pro (or DaVinci Resolve)
Strong storyboard literacy and ability to convert briefs into visual sequences
Familiarity with AI tools for asset creation, animation, or TTS
Fast, efficient, and organized with excellent communication skills
Bonus Skills
Journalism, news, or media production experience
Python, C#, or scripting for pipeline automation
Experience with short-form video platforms (YouTube Shorts, Reels, LinkedIn)
Color grading expertise using DaVinci Resolve
We are looking for a Senior Data Engineer with strong expertise in GCP, Databricks, and Airflow to design and implement a GCP Cloud Native Data Processing Framework. The ideal candidate will work on building scalable data pipelines and help migrate existing workloads to a modern framework.
- Shift: 2 PM - 11 PM
- Work Mode: Hybrid (3 days a week) across Xebia locations
- Notice Period: Immediate joiners or those with a notice period of up to 30 days
Key Responsibilities:
- Design and implement a GCP Native Data Processing Framework leveraging Spark and GCP Cloud Services.
- Develop and maintain data pipelines using Databricks and Airflow for transforming Raw → Silver → Gold data layers.
- Ensure data integrity, consistency, and availability across all systems.
- Collaborate with data engineers, analysts, and stakeholders to optimize performance.
- Document standards and best practices for data engineering workflows.
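The Raw → Silver → Gold layering above can be illustrated with a dependency-free sketch. In the actual role these transforms would run as Spark jobs on Databricks orchestrated by Airflow; the record shapes here are hypothetical:

```python
# Dependency-free sketch of Raw -> Silver -> Gold data layering.
# Real pipelines would use Spark/Databricks with Airflow orchestration;
# the record shapes below are hypothetical.

def to_silver(raw_rows):
    """Silver layer: cleaned and typed rows; drop malformed records."""
    silver = []
    for row in raw_rows:
        try:
            silver.append({"user": row["user"].strip().lower(),
                           "amount": float(row["amount"])})
        except (KeyError, ValueError):
            continue  # a real pipeline would quarantine these rows
    return silver

def to_gold(silver_rows):
    """Gold layer: business-level aggregate (total amount per user)."""
    totals = {}
    for row in silver_rows:
        totals[row["user"]] = totals.get(row["user"], 0.0) + row["amount"]
    return totals

raw = [{"user": " Alice ", "amount": "10.5"},
       {"user": "alice", "amount": "4.5"},
       {"user": "bob", "amount": "not-a-number"}]
gold = to_gold(to_silver(raw))
```

Each layer has a single responsibility — cleaning in Silver, aggregation in Gold — which is what makes the framework testable and the layers independently rebuildable.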
Required Experience:
- 7-8 years of experience in data engineering, architecture, and pipeline development.
- Strong knowledge of GCP, Databricks, PySpark, and BigQuery.
- Experience with Orchestration tools like Airflow, Dagster, or GCP equivalents.
- Understanding of Data Lake table formats (Delta, Iceberg, etc.).
- Proficiency in Python for scripting and automation.
- Strong problem-solving skills and collaborative mindset.
⚠️ Please apply only if you have not applied recently or are not currently in the interview process for any open roles at Xebia.
Looking forward to your response!
Best regards,
Vijay S
Assistant Manager - TAG
Manage store operational requirements by scheduling and assigning employees
Maintain results by coaching, counseling and disciplining employees
Prepare annual budget, schedule expenditure, analyze variances and initiate corrective actions
Protect employees and customers by providing a safe and clean store environment
Manage all controllable costs to keep operations profitable
Greetings!
We are hiring for a TAM (Technical Account Manager); please find the details below:
CADeploy Engineering Pvt. Ltd. (www.cadeploy.com) To give a brief introduction about my company: we are an emerging MNC engineering firm with a footprint serving mid-market and large organizations in the USA, Canada, Europe, and the UK. We provide Mechanical, Civil, Architectural and Structural Engineering solutions in the Building Construction, Industrial, Infrastructure, Automotive and Aerospace sectors.
Visit our website www.cadeploy.com for further details.
Job Location : Hyderabad
Shift : Fixed Night Shift
Job Overview
The TAM manages an assigned client base, both engaged and prospective, by acting as a technical and consultative resource that coordinates with other departments within CADeploy to facilitate client needs. The TAM ensures the highest level of client satisfaction through identification of specific requirements, transcribing these into clear communication and follow-through on commitments. The general purpose is to maintain healthy relationships and project performance through continual pulse monitoring with the client. This will include checks on service delivery performance, recognition of special requirements, flagging of risk perceptions, etc. The objective is to build and maintain a strong working affiliation for continuity of account while guiding the technical and managerial execution for efficiency and effectiveness. This role is essential to the growth plan of CADeploy as a foundation to its brand recognition for quality performance.
Responsibilities and Duties
The TAM will be expected to generally execute on the following across an assigned client base:
- Serve as an important point of contact for key clients as well as all internal stakeholders.
- Develop and nurture strong client relationships through top level customer service.
- Know and document client requirements and concerns, understand the details of their business processes and ensure that CADeploy will appropriately meet their evolving needs.
- Provide excellent, regular client communications and responsive, consistent follow-through on all issues and actions.
- Coordinate and assist with the roll-out of CADeploy’s services offering to new clients as required.
- Drive strategic planning and development of improvements and work.
- Handle client requests and assist in the development of quotations/proposals ensuring that CADeploy’s standards are enforced uniformly.
- Facilitate internal project hand-offs to verify that the work performed is complete and meets the client's as well as CADeploy's standards.
- Provide customized, end-user (staff) communications for all new clients engaged, ensuring a seamless and successful deployment of CADeploy’s services to their specific standards.
- Act as an escalation point for technical and client service issues as necessary.
- Work with the PMO and delivery team to direct troubleshooting efforts on escalated issues as needed.
Regards,
Bhavani
2. Daily report to the reporting manager
3. Should be able to work in agile methodology
4. Should be comfortable with State Management in Flutter
5. Should be well versed with programming concepts
Web Developer:
Company Name: 9Roads
Job Location: Khairatabad, Hyderabad
Date of joining: Immediate
Salary Range: INR 16,000 to 20,000
- Immediate Requirement:
Ideal candidate possesses expertise in making UI changes and adding new features to web pages and mobile apps.
Prior work experience is not mandatory; expertise in the technologies below is mandatory.
- Must have: CSS, AJAX, jQuery, PHP, Ionic framework.
- Good to have: Node.js, AWS.
- Able to work in a fast-paced environment.
- Resource should be able to work independently and deliver from Day 1.
- Learning curve: You will get an opportunity to pick up skills in AWS, Node.js, and Python.