11+ FICA Jobs in India
FICA (AM – FICA)
- SAP IS-U FICA Functional Support
  - Conceptualizing and mapping of business scenarios
  - Understanding business requirements and handing them over to developers
  - Functional support for all Collection (FICA) processes
  - Expertise in and understanding of migration activities
- SAP IS-U Developments
  - Preparing functional specification documents
  - Implementation/execution of development logic
  - Technical/functional testing of IS-U FICA developments
- SAP Roles & Authorization
  - Role assignment to users; creation of new roles/objects/fields according to requirements
Role Overview:
We are seeking a backend-focused Software Engineer with deep expertise in REST APIs,
real-time integrations, and cloud-based application services. The ideal candidate will build
scalable backend systems, integrate real-time data flows, and contribute to system design
and documentation. This is a hands-on role working with global teams in a fast-paced, Agile
environment.
Key Responsibilities:
• Design, develop, and maintain REST APIs and backend services using Python, FastAPI,
and SQLAlchemy.
• Build and support real-time integrations using AWS Lambda, API Gateway, and
EventBridge.
• Develop and maintain Operational Data Stores (ODS) for real-time data access.
• Write performant SQL queries and work with dimensional data models in PostgreSQL.
• Contribute to cloud-based application logic and data orchestration.
• Containerize services using Docker and deploy via CI/CD pipelines.
• Implement automated testing using pytest, pydantic, and related tools.
• Collaborate with cross-functional Agile teams using tools like Jira.
• Document technical workflows, APIs, and system integrations with clarity and
consistency.
• Provide team management and mentorship as needed.
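The first responsibility above, REST API design with resource routing, can be sketched with a minimal, dependency-free router. The posting's actual stack is FastAPI/SQLAlchemy; the route patterns, handler names, and in-memory store below are all hypothetical stand-ins used only to illustrate the pattern:

```python
import re

# Minimal REST-style router: maps (HTTP method, path pattern) to a handler.
ROUTES = []

def route(method, pattern):
    """Register a handler for a method and a regex path pattern."""
    def register(fn):
        ROUTES.append((method, re.compile(f"^{pattern}$"), fn))
        return fn
    return register

# In-memory "table" standing in for a SQLAlchemy-backed store.
_users, _next_id = {}, 1

@route("POST", r"/users")
def create_user(body):
    global _next_id
    user = {"id": _next_id, "name": body["name"]}
    _users[_next_id] = user
    _next_id += 1
    return 201, user  # (status code, response body)

@route("GET", r"/users/(\d+)")
def get_user(body, user_id):
    user = _users.get(int(user_id))
    return (200, user) if user else (404, {"error": "not found"})

def dispatch(method, path, body=None):
    """Match a request against the route table and invoke the handler."""
    for m, pat, fn in ROUTES:
        match = pat.match(path)
        if m == method and match:
            return fn(body, *match.groups())
    return 404, {"error": "no route"}
```

In FastAPI the decorator, path-parameter parsing, and JSON serialization above are provided by the framework; the dispatch table here just makes the mechanics visible.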
Required Skills & Experience:
• 8+ years of backend or integrations engineering experience.
• Expert-level knowledge of REST API development and real-time system design.
• Strong experience with Python (FastAPI preferred) and SQLAlchemy.
• PostgreSQL and advanced SQL.
• AWS Lambda, API Gateway, EventBridge.
• Operational Data Stores (ODS) and distributed system integration.
• Experience with Docker, Git, CI/CD tools, and automated testing frameworks.
• Experience working in Agile environments and collaborating with cross-functional
teams.
• Comfortable producing and maintaining clear technical documentation.
• Working knowledge of React is a plus but not a focus.
• Hands-on experience working with Databricks or similar data platforms.
Education & Certifications:
• Bachelor’s degree in Computer Science, Engineering, or a related field (required).
• Master’s degree is a plus.
• Certifications in AWS (e.g., Developer Associate, Solutions Architect) or Python
frameworks are highly preferred.
🧑💼 Position: Telesales Executive
📍 Location: New Delhi – 110044
💼 Industry: BPO / Telesales / Voice Process
🗣️ Language Requirement: Bengali (Fluent)
✅ Key Requirements:
- Minimum Qualification: 12th Pass (Higher Secondary)
- Experience: Minimum 1 year of experience in an outbound sales process
- Strong communication and convincing skills
- Excellent verbal clarity and diction
- Ability to handle high call volumes and customer interaction confidently
⭐ Preferred Skills:
- Fluency in Bengali is mandatory
- Prior experience in BPO voice processes is preferred
💰 Compensation:
- ₹25,000 CTC + Attractive Incentives
- Performance-based incentive structure
- Design patterns: socket communication / microservices architecture
- Caching: Redis, Memcached, etc.
- Databases: MongoDB, SQL, etc.; minimum 2 years' experience with these.
- Experience building features for large volumes of concurrent requests.
- Node.js, Golang, or any asynchronous programming language; minimum 2 years' experience with one of them.
- Message queues: RabbitMQ, Kafka, etc.
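The caching requirement above (Redis, Memcached) comes down to key-value storage with per-key expiry. A minimal in-memory sketch of that pattern follows; this is not a Redis client, and the class and method names are illustrative only:

```python
import time

class TTLCache:
    """In-memory cache with Redis-style set/get and per-key expiry (TTL)."""

    def __init__(self):
        self._store = {}  # key -> (value, expires_at or None)

    def set(self, key, value, ttl=None):
        """Store a value; expire it ttl seconds from now if ttl is given."""
        expires_at = time.monotonic() + ttl if ttl is not None else None
        self._store[key] = (value, expires_at)

    def get(self, key):
        """Return the value, or None if missing or expired."""
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if expires_at is not None and time.monotonic() >= expires_at:
            del self._store[key]  # lazy eviction, as Redis does for expired keys
            return None
        return value
```

Real Redis adds persistence, eviction policies, and network access on top of these same semantics.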
1) Candidate must have knowledge of the testing domain and of testing web-based applications
2) Understand requirements, design exhaustive test scenarios, execute manual test cases, dig deeper into issues, identify root causes, and articulate defects clearly
3) Strive for excellence in quality by looking beyond obvious scenarios and stated requirements, keeping end-user needs in mind
4) Collaborate with the engineering team and product management to elicit and understand requirements and develop potential solutions
Experience: 4+ years.
Location: Vadodara & Pune
Skill set: Snowflake, Power BI, ETL, SQL, data pipelines
What you'll be doing:
- Develop, implement, and manage scalable Snowflake data warehouse solutions using advanced features such as materialized views, task automation, and clustering.
- Design and build real-time data pipelines from Kafka and other sources into Snowflake using Kafka Connect, Snowpipe, or custom solutions for streaming data ingestion.
- Create and optimize ETL/ELT workflows using tools like DBT, Airflow, or cloud-native solutions to ensure efficient data processing and transformation.
- Tune query performance, warehouse sizing, and pipeline efficiency by utilizing Snowflake's Query Profiling, Resource Monitors, and other diagnostic tools.
- Work closely with architects, data analysts, and data scientists to translate complex business requirements into scalable technical solutions.
- Enforce data governance and security standards, including data masking, encryption, and RBAC, to meet organizational compliance requirements.
- Continuously monitor data pipelines, address performance bottlenecks, and troubleshoot issues using monitoring frameworks such as Prometheus, Grafana, or Snowflake-native tools.
- Provide technical leadership, guidance, and code reviews for junior engineers, ensuring best practices in Snowflake and Kafka development are followed.
- Research emerging tools, frameworks, and methodologies in data engineering and integrate relevant technologies into the data stack.
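The real-time ingestion responsibility above (Kafka into Snowflake via Snowpipe or Kafka Connect) typically reduces to a micro-batching pattern: drain a stream into fixed-size batches and load each batch in one operation. A sketch under stated assumptions — a stdlib queue stands in for a Kafka consumer, and a plain list stands in for the warehouse load; all names are hypothetical:

```python
import queue

def micro_batch(source: "queue.Queue", batch_size: int, sink: list) -> int:
    """Drain `source` into fixed-size batches appended to `sink`.

    Mimics the Kafka -> Snowpipe micro-batch ingestion pattern: each
    appended batch stands in for one bulk load (e.g. a COPY INTO).
    Returns the number of batches written.
    """
    batch, written = [], 0
    while True:
        try:
            record = source.get_nowait()
        except queue.Empty:
            break  # stream drained for this polling cycle
        batch.append(record)
        if len(batch) >= batch_size:
            sink.append(list(batch))  # one bulk load per full batch
            batch.clear()
            written += 1
    if batch:  # flush the final partial batch
        sink.append(list(batch))
        written += 1
    return written
```

Production ingestion adds offset commits, retries, and backpressure on top of this loop; Snowpipe effectively performs the batching and loading server-side.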
What you need:
Basic Skills:
- 3+ years of hands-on experience with Snowflake data platform, including data modeling, performance tuning, and optimization.
- Strong experience with Apache Kafka for stream processing and real-time data integration.
- Proficiency in SQL and ETL/ELT processes.
- Solid understanding of cloud platforms such as AWS, Azure, or Google Cloud.
- Experience with scripting languages like Python, Shell, or similar for automation and data integration tasks.
- Familiarity with tools like dbt, Airflow, or similar orchestration platforms.
- Knowledge of data governance, security, and compliance best practices.
- Strong analytical and problem-solving skills with the ability to troubleshoot complex data issues.
- Ability to work in a collaborative team environment and communicate effectively with cross-functional teams
Responsibilities:
- Design, develop, and maintain Snowflake data warehouse solutions, leveraging advanced Snowflake features like clustering, partitioning, materialized views, and time travel to optimize performance, scalability, and data reliability.
- Architect and optimize ETL/ELT pipelines using tools such as Apache Airflow, DBT, or custom scripts, to ingest, transform, and load data into Snowflake from sources like Apache Kafka and other streaming/batch platforms.
- Work in collaboration with data architects, analysts, and data scientists to gather and translate complex business requirements into robust, scalable technical designs and implementations.
- Design and implement Apache Kafka-based real-time messaging systems to efficiently stream structured and semi-structured data into Snowflake, using Kafka Connect, KSQL, and Snowpipe for real-time ingestion.
- Monitor and resolve performance bottlenecks in queries, pipelines, and warehouse configurations using tools like Query Profile, Resource Monitors, and Task Performance Views.
- Implement automated data validation frameworks to ensure high-quality, reliable data throughout the ingestion and transformation lifecycle.
- Pipeline Monitoring and Optimization: Deploy and maintain pipeline monitoring solutions using Prometheus, Grafana, or cloud-native tools, ensuring efficient data flow, scalability, and cost-effective operations.
- Implement and enforce data governance policies, including role-based access control (RBAC), data masking, and auditing to meet compliance standards and safeguard sensitive information.
- Provide hands-on technical mentorship to junior data engineers, ensuring adherence to coding standards, design principles, and best practices in Snowflake, Kafka, and cloud data engineering.
- Stay current with advancements in Snowflake, Kafka, cloud services (AWS, Azure, GCP), and data engineering trends, and proactively apply new tools and methodologies to enhance the data platform.
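The data-governance responsibilities above mention data masking alongside RBAC and auditing. Two common masking techniques can be sketched in a few lines — partial redaction for display, and deterministic pseudonymization so joins across masked tables still work. The function names and formats below are illustrative, not any particular platform's API:

```python
import hashlib

def mask_email(email: str) -> str:
    """Partially mask an email for display: keep the first character
    of the local part and the full domain."""
    local, _, domain = email.partition("@")
    return (local[0] + "***@" + domain) if domain else "***"

def pseudonymize(value: str, salt: str) -> str:
    """Deterministic pseudonym via salted SHA-256: the same input and
    salt always map to the same token, so masked tables can still be
    joined on it without exposing the raw value."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:16]
```

Snowflake offers these ideas natively as masking policies and external tokenization; a hand-rolled version like this is mainly useful in pipelines upstream of the warehouse.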
Freshers most welcome (education background: Marketing only)
- Must have good knowledge of marketing
- Must have knowledge of product sales
- Experience with product sales or marketing is a plus
Must have: strong Python
Skills required:
- Candidate must be available in Bangalore before joining
- Fluency in English
- Graduation is mandatory
Responsibilities-
• Conversion of leads received through various marketing channels.
• Preparing short-term and long-term sales plans towards reaching the assigned goals.
• Consistently achieving revenue targets in line with team/organizational objectives.
• Proactively identifying cross-selling/upselling opportunities within the existing customer base.
• Identifying references through the existing customer base to increase the sales pipeline.
Job description
Are you a student interested in building real-world graphic design experience with a cutting-edge, forward-thinking business? Synapsica is looking for a talented and knowledgeable designer with fresh, creative ideas and an excellent eye for detail.
We are a B2B SaaS Healthcare startup working with leading Radiologists and Diagnostic Centers innovating in the realms of AI/ML, Radiology, and Digital Transformation. With a clientele comprising some leading healthcare brands across India and the US, we're striving to fast-track innovation across the healthcare industry. Join us to work on intelligent, witty, trendy, and new-age designs that will shape the medical technology landscape through the next decade!
(Visit https://www.synapsica.com/ to know more about the company.)
Graphic Design Intern Duties and Responsibilities
- Create and design all web page elements, wireframes, print and digital graphics
- Ensure projects are completed with high quality and on schedule
- Adhere to brand guidelines and complete projects according to deadline
- Prioritize and manage multiple projects within design specifications and budget restrictions
- Perform retouching and manipulation of images
- Work with a wide range of media and use graphic design software
- Collaborate with the marketing team to design concepts
- Creation and updates to various sales collateral
- Assist with video production (shooting, editing, etc.)
- Assemble final presentation material for printing as needed
Requirements
- Graphic Design major preferred
- Basic knowledge of layouts, typography, line composition, color, and other graphic design fundamentals
- Experience with Adobe CC (After Effects, InDesign, Photoshop, and Illustrator)
- Strong creative and analytical skills
- Compelling portfolio of graphic design work
- Organized, dependable and detail oriented
- Team Player
- Quick Learner and Efficient
- High sense of urgency