
Manager – FP&A Revenue & Contribution – For Career Break Candidates
· People management: delegating, motivating, and developing the team.
· Effective communication, presentation, and management reporting skills.
· Ability to work with a high level of accuracy within assigned timelines.
· Adaptive and innovative mindset.
· Readiness to take up challenges and develop new ways of working.
· Ability to work collaboratively within and outside the team toward a common objective.
· Expert knowledge of DBMS, SQL, and code development.
· Good hands-on experience with IT tools such as MS Excel, Access, PowerPoint, and Qlik or other reporting software.
· Strong analytical and problem-solving skills.
Experience
· Minimum 8 years' overall experience; a minimum of 3 years of reporting/analytics experience in a supervisory role is preferable.

About Colt Technology Services
Review Criteria:
- Strong Dremio / Lakehouse Data Architect profile
- 5+ years of experience in Data Architecture / Data Engineering, with 3+ years hands-on in Dremio
- Strong expertise in SQL optimization, data modeling, query performance tuning, and designing analytical schemas for large-scale systems
- Deep experience with cloud object storage (S3 / ADLS / GCS) and file formats such as Parquet, Delta, Iceberg along with distributed query planning concepts
- Hands-on experience integrating data via APIs, JDBC, Delta/Parquet, object storage, and coordinating with data engineering pipelines (Airflow, DBT, Kafka, Spark, etc.)
- Proven experience designing and implementing lakehouse architecture including ingestion, curation, semantic modeling, reflections/caching optimization, and enabling governed analytics
- Strong understanding of data governance, lineage, RBAC-based access control, and enterprise security best practices
- Excellent communication skills with ability to work closely with BI, data science, and engineering teams; strong documentation discipline
- Candidates must come from enterprise data modernization, cloud-native, or analytics-driven companies
Preferred:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) or data catalogs (Collibra, Alation, Purview); familiarity with Snowflake, Databricks, or BigQuery environments
Role & Responsibilities:
You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.
- Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
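The reflections and caching optimization mentioned above boil down to serving dimensional queries from precomputed aggregates instead of scanning raw fact data. The sketch below is only an illustration of that idea: sqlite3 stands in for the query engine, and the table and column names are hypothetical. In real Dremio, an aggregate reflection is defined on a dataset and refreshed by the engine itself, not hand-built like this.

```python
import sqlite3

# Stand-in sketch: sqlite3 illustrates the idea behind a Dremio aggregate
# reflection -- a precomputed aggregate (dimensions + measures) that answers
# BI-style queries without touching the raw fact table. All names are
# hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales_fact (region TEXT, product TEXT, amount REAL);
    INSERT INTO sales_fact VALUES
        ('EMEA', 'widget', 100.0),
        ('EMEA', 'gadget', 250.0),
        ('APAC', 'widget', 175.0);

    -- Plays the role of an aggregate reflection: dimension (region),
    -- measures (SUM(amount), COUNT(*)), refreshed from the fact table.
    CREATE TABLE sales_by_region AS
        SELECT region, SUM(amount) AS total, COUNT(*) AS cnt
        FROM sales_fact GROUP BY region;
""")

# A BI-style query served from the "reflection" instead of the raw facts.
rows = dict(conn.execute("SELECT region, total FROM sales_by_region").fetchall())
print(sorted(rows.items()))  # [('APAC', 175.0), ('EMEA', 350.0)]
```

In Dremio the planner substitutes a matching reflection transparently; the point of the sketch is only the trade-off being optimized: query latency versus refresh cost and storage for the precomputed aggregate.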
Ideal Candidate:
- Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
- 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
- Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
Preferred:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
- Exposure to Snowflake, Databricks, or BigQuery environments.
- Experience in high-tech, manufacturing, or enterprise data modernization programs.
We at CLOUDSUFI are hiring for Workato AI Integration Consultants.
Location - Remote
Type - Contractual (6 months, extendable)
Shift - 3pm-12am
Qualifications:
• Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field, or equivalent practical experience.
• Proven 5+ years of hands-on experience in designing, developing, and deploying integrations using the Workato platform.
• Strong understanding of Integration Platform as a Service (iPaaS) concepts and enterprise application integration (EAI) patterns.
• Demonstrated experience with Workato's AI capabilities, including building solutions with AI by Workato, IDP, or Agentic AI.
• Proficiency in working with various APIs (REST, SOAP), webhooks, and different data formats (JSON, XML, CSV).
• Experience with scripting languages (e.g., Ruby, Python, JavaScript) for custom logic within Workato recipes is a plus.
• Solid understanding of database concepts and experience with SQL/NoSQL databases.
• Excellent problem-solving, analytical, and troubleshooting skills with keen attention to detail.
• Strong communication and interpersonal skills, with the ability to effectively collaborate with technical and non-technical stakeholders.
• Workato certifications (e.g., Automation Pro I, II, III, Integration Developer) are desirable.
• Experience with Agile/Scrum methodologies is a plus.
Preferred Skills (Nice to Have):
• Experience with other integration platforms (e.g., MuleSoft, Boomi, Zapier).
• Familiarity with cloud platforms (AWS, Azure, GCP).
• Knowledge of microservices architecture.
• Experience with version control systems (e.g., Git).
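The "custom logic within Workato recipes" and data-format requirements above usually amount to small transforms between CSV, JSON, and XML payloads. The following is a stand-alone Python sketch of that kind of step, written outside Workato; the field names and target shape are hypothetical, and a real recipe would run equivalent logic in a script or formula step.

```python
import csv
import io
import json

def normalize_orders(csv_text: str) -> str:
    """Convert a CSV export into the JSON shape a downstream app expects.

    Stand-alone sketch of the custom logic a Workato recipe might delegate
    to a script step; field names here are hypothetical.
    """
    rows = csv.DictReader(io.StringIO(csv_text))
    orders = [
        {"id": r["order_id"], "total": float(r["amount"]), "currency": r["ccy"].upper()}
        for r in rows
    ]
    return json.dumps({"orders": orders})

payload = "order_id,amount,ccy\nA-1,19.99,usd\nA-2,5.00,eur\n"
print(normalize_orders(payload))
```

The design point is the same as in an iPaaS mapping step: parse the source format into records, normalize types and casing, then emit the canonical shape the target system expects.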
For the role of .NET developer, the responsibilities include the following tasks:
To design, develop, test, maintain, and support custom mobile and web applications
To gather software requirements from customers and develop applications that meet them
To research and evaluate software-related products and technologies
To design and develop activities and procedures related to application maintenance
To write high-quality code that meets customer requirements
To design, develop, and implement critical applications in a .NET environment
To allocate project work according to the client's requirements and budget
Role Summary
As a Data Engineer, you will be an integral part of our Data Engineering team supporting an event-driven, serverless data engineering pipeline on AWS cloud, responsible for assisting in the end-to-end analysis, development, and maintenance of data pipelines and systems (DataOps). You will work closely with fellow data engineers and production support to ensure the availability and reliability of data for analytics and business intelligence purposes.
Requirements:
· Around 4 years of working experience in data warehousing / BI systems.
· Strong hands-on experience with Snowflake and strong programming skills in Python.
· Strong hands-on SQL skills.
· Knowledge of any of the cloud databases, such as Snowflake, Redshift, Google BigQuery, RDS, etc.
· Knowledge of dbt for cloud databases.
· Experience with AWS services such as SNS, SQS, ECS, Kinesis, and Lambda functions, as well as Docker.
· Solid understanding of ETL processes and data warehousing concepts.
· Familiarity with version control systems (e.g., Git, Bitbucket) and collaborative development practices in an Agile framework.
· Experience with Scrum methodologies.
· Experience with infrastructure build tools such as CFT / Terraform is a plus.
· Knowledge of Denodo, data cataloguing tools, and data quality mechanisms is a plus.
· Strong team player with good communication skills.
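The event-driven serverless pipeline described above typically means SQS or SNS messages triggering Lambda functions that stage data for a warehouse load. The sketch below is a minimal, hedged illustration using only the standard library: the event shape follows the standard SQS-to-Lambda record structure, but the message fields and the downstream load step are hypothetical.

```python
import json

def handler(event, context=None):
    """Sketch of a Lambda entry point in an event-driven DataOps pipeline:
    parse SQS-delivered records and return rows staged for a warehouse load.
    The record envelope follows the standard SQS event structure; the
    message fields and downstream table are hypothetical.
    """
    staged = []
    for record in event.get("Records", []):
        body = json.loads(record["body"])
        staged.append({
            "event_id": record["messageId"],
            "user": body["user"],
            "amount": body["amount"],
        })
    # A real pipeline would now COPY/merge these rows into Snowflake,
    # Redshift, or BigQuery rather than just returning them.
    return {"staged_rows": len(staged), "rows": staged}

sample = {
    "Records": [
        {"messageId": "m-1", "body": json.dumps({"user": "ana", "amount": 42})},
    ]
}
print(handler(sample))
```

Keeping the handler a pure event-in/rows-out function, as here, is what makes this style of pipeline easy to test locally before wiring it to SQS, Kinesis, or SNS triggers.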
Overview Optisol Business Solutions
OptiSol was named to this year's Best Companies to Work For list by Great Place to Work. We are a team of 500+ Agile employees with a development center in India and global offices in the US, UK (United Kingdom), Australia, Ireland, Sweden, and Dubai. In a joyful journey of 16+ years we have built about 500+ digital solutions, and we have 200+ happy and satisfied clients across 24 countries.
Benefits, working with Optisol
· Great Learning & Development program
· Flextime, Work-at-Home & Hybrid Options
· A knowledgeable, high-achieving, experienced & fun team.
· Spot Awards & Recognition.
· The chance to be a part of the next success story.
· A competitive base salary.
More than just a job, we offer an opportunity to grow. Are you someone looking to build your future and your dream? We have the job for you to make your dream come true.
Our client has the backing of one of India's leading Business houses which has a turnover of Rs.100 billion. The company produces excellent quality writing and printing paper, and they are the leading manufacturer of tissue and board, as well as Rayon Grade Pulp (RGP) products. They have the largest single location manufacturing facility in India with an estimated revenue of $24.1 million. Our client continues to be the committed employer of the local community and has achieved many international standards like OHSAS and FSC as a leading producer of pulp and paper.
As the Sales Executive (Retail-GT)-Tissue, you will be responsible for handling Primary Sales, Secondary sales and Collection for the company.
What you will do:
- Being involved in setting up and strengthening a distribution network (super-stockists & distributors) in the respective areas mentioned above.
- Leading the initiative of launching the products in the geographical areas mentioned above and initiate BTL activities for better placement of the products.
- Driving sales promotion and merchandising to boost secondary sales.
- Studying the market potential and suggesting the recruitment of DSMs at the respective locations to improve market penetration.
- Ensuring that products are available in all the major retail outlets.
- Effectively managing the team and ensuring a high ratio of productive sales calls.
- Motivating and empowering the sales representatives for increased productivity by working with them on a regular basis and creating synergy within the team.
What you need to have:
- Candidates with 1-3 years of experience from FMCG companies only; those from the personal care and beverage industries are preferred
- Ready to travel outstation once in a while
- Excellent negotiation and networking skills
- Strong contacts and existing network of buyers will be an added advantage.
- Strong communication skills
Essential Skills:
- MS BizTalk
- Good experience in mapping, orchestration, and deployments.
- Good to have: experience in system support (operations)
- Knowledge of different adapters/configurations
Preferred Skills:
- Development Experience or Knowledge on MS Azure – Logic Apps, Functions, Service Bus, Storage.
- BizTalk 360
- C# or any Programming language basics
- Azure DevOps (CI/CD)
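The mapping experience called for above is, at its core, transforming a source XML message into a destination schema. The snippet below is only an analogy in Python's standard library: real BizTalk maps are XSLT-based artifacts built in the BizTalk mapper, and the element names here are hypothetical.

```python
import xml.etree.ElementTree as ET

def map_order(src_xml: str) -> str:
    """Illustrative stand-in for a BizTalk map: reshape a source XML
    message into a destination schema. Element names are hypothetical;
    actual BizTalk maps are XSLT artifacts, not Python code.
    """
    src = ET.fromstring(src_xml)
    dst = ET.Element("Invoice")
    ET.SubElement(dst, "Number").text = src.findtext("OrderId")
    ET.SubElement(dst, "Customer").text = src.findtext("Buyer/Name")
    ET.SubElement(dst, "Total").text = src.findtext("Amount")
    return ET.tostring(dst, encoding="unicode")

src = "<Order><OrderId>PO-7</OrderId><Buyer><Name>Acme</Name></Buyer><Amount>99.50</Amount></Order>"
print(map_order(src))
# <Invoice><Number>PO-7</Number><Customer>Acme</Customer><Total>99.50</Total></Invoice>
```

The same source-tree-to-destination-tree pattern is what the BizTalk mapper expresses graphically, with functoids playing the role of the inline transformations.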
commercial, federal and state government plans, specifically supporting operational processes for provider data management functions, enrollment and eligibility, member benefits, claims adjudication, and EDI interface functions.
• Knowledge of State regulations to determine whether a provider meets the qualifications to be enrolled.
• Knowledge of provider files from State Medicaid programs to decipher provider enrollment and eligibility rules.
• Knowledge of provider type designations identified by the State Medicaid programs.
• Knowledge of provider and member portal functions.
• Knowledge of provider reimbursement methodologies with respect to Commercial, CMS, and State-defined guidelines.
• Knowledge of provider matching criteria for claims.
• Knowledge of Coordination of Benefits (COB) and claim authorization functions.
• Knowledge of setting up communications for EDI transmissions using FTP, SFTP, and real-time API setups.
• Knowledge of health insurance, HMO, and managed care principles, including Medicaid and Medicare regulations.
• Solid analytical skills with the ability to compile data from many sources and define designs for enrollment to benefit plan configuration.
• Research, interpret, and summarize new state, federal, and client rules regarding department functions; alter or create policies and procedures to adhere to those rules.
• Solid communication skills with working-session facilitation.
• Strong time management, attention to detail, and analytical and organizational skills.
• Excellent interpersonal, oral, and written communication skills.
• Able to work independently and within a collaborative team environment with little guidance/supervision.

Responsibilities:
- Identify complex business problems and work towards building analytical solutions in order to create large business impact.
- Demonstrate leadership through innovation in software and data products from ideation/conception through design, development and ongoing enhancement, leveraging user research techniques, traditional data tools, and techniques from the data science toolkit such as predictive modelling, NLP, statistical analysis, vector space modelling, machine learning etc.
- Collaborate and ideate with cross-functional teams to identify strategic questions for the business that can be solved and champion the effectiveness of utilizing data, analytics, and insights to shape business.
- Contribute to company growth efforts, increasing revenue and supporting other key business outcomes using analytics techniques.
- Focus on driving operational efficiencies by use of data and analytics to impact cost and employee efficiency.
- Baseline the current analytics capability, and ensure optimum utilization and continued advancement to stay abreast of industry developments.
- Establish self as a strategic partner with stakeholders, focused on full innovation system and fully supportive of initiatives from early stages to activation.
- Review stakeholder objectives and team's recommendations to ensure alignment and understanding.
- Drive analytics thought leadership and contribute effectively towards transformational initiatives.
- Ensure accuracy of data and deliverables of reporting employees with comprehensive policies and processes.








