
Work Experience: 1-3 years
Work Location: Currently Remote (WFH)
(Post reopening of offices, from any office location across Gurgaon / Noida / Bangalore)
Requirements:
• Demonstrated proficiency with at least one development tool /
technology.
• Experience with web application design and analysis, design patterns, and
object-oriented design
• Experience in multiple web technologies including: XML, HTML, CSS, AJAX
/ JavaScript, Web Services/SOAP, SQL
• Experience with one or more Java concepts and patterns including:
Java/J2EE, JSP, Spring, Sling, JMS, JUnit, AOP, MVC, Eclipse
• Working knowledge of multiple web and application tier technologies:
Tomcat, WebSphere, WebLogic, Apache HTTP Server, Spring tc Server, Solr, and other open-source packages
• Experience with multiple source control systems: CVS, SVN, Git
• Experience in a consulting environment with a demonstrated track record of
continuing responsibilities, creativity, and innovation
Minimum Requirements:
• Minimum of 3 years of hands-on software development experience.
• Minimum of 2 years of full life cycle implementation experience using various SDLC methodologies.
• Minimum of 1 year of experience developing eCommerce applications, specifically Salesforce Commerce Cloud B2C.
• Bachelor's degree or equivalent (minimum 12 years) work experience.
Professional Attributes:
• Deliver quality solutions on budget and on schedule across projects.
• Technically sound understanding of platform-specific architecture.
• Proficient in effectively documenting complex functionality in code and
documentation
• Conduct development on the SFCC platform using a multi-site SFRA
architecture.
• Plan, design, build, and review content-managed, usable, and accessible web solutions.
Interested candidates can share updated CVs.

Similar jobs
Job Title : Oracle Analytics Cloud (OAC) / Fusion Data Intelligence (FDI) Specialist
Experience : 3 to 8 years
Location : All USI locations – Hyderabad, Bengaluru, Mumbai, Gurugram (preferred) and Pune, Chennai, Kolkata
Work Mode : Hybrid Only (2-3 days from office or all 5 days from office)
Mandatory Skills : Oracle Analytics Cloud (OAC), Fusion Data Intelligence (FDI), RPD, OAC Reports, Data Visualizations, SQL, PL/SQL, Oracle Databases, ODI, Oracle Cloud Infrastructure (OCI), DevOps tools, Agile methodology.
Key Responsibilities :
- Design, develop, and maintain solutions using Oracle Analytics Cloud (OAC).
- Build and optimize complex RPD models, OAC reports, and data visualizations.
- Utilize SQL and PL/SQL for data querying and performance optimization (a minimal sketch follows this list).
- Develop and manage applications hosted on Oracle Cloud Infrastructure (OCI).
- Support Oracle Cloud migrations, OBIEE upgrades, and integration projects.
- Collaborate with teams using the ODI (Oracle Data Integrator) tool for ETL processes.
- Implement cloud scripting using cURL for Oracle Cloud automation.
- Contribute to the design and implementation of Business Continuity and Disaster Recovery strategies for cloud applications.
- Participate in Agile development processes and DevOps practices including CI/CD and deployment orchestration.
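To make the SQL querying and tuning work above concrete, here is a minimal sketch using the python-oracledb driver. The connection details, table, and column names are hypothetical stand-ins, not part of this role's actual environment.

```python
# Minimal sketch: running and timing a query against an Oracle database,
# as one might when profiling the SQL behind an OAC report.
# Connection details, table, and column names are hypothetical.
import time

import oracledb  # python-oracledb driver (pip install oracledb)

def fetch_revenue_by_region(dsn: str, user: str, password: str):
    with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
        with conn.cursor() as cur:
            start = time.perf_counter()
            cur.execute(
                """
                SELECT region, SUM(revenue) AS total_revenue
                FROM sales_fact
                GROUP BY region
                ORDER BY total_revenue DESC
                """
            )
            rows = cur.fetchall()
            elapsed = time.perf_counter() - start
    print(f"Fetched {len(rows)} rows in {elapsed:.3f}s")
    return rows
```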
Required Skills :
- Strong hands-on expertise in Oracle Analytics Cloud (OAC) and/or Fusion Data Intelligence (FDI).
- Deep understanding of data modeling, reporting, and visualization techniques.
- Proficiency in SQL, PL/SQL, and relational databases on Oracle.
- Familiarity with DevOps tools, version control, and deployment automation.
- Working knowledge of Oracle Cloud services, scripting, and monitoring.
Good to Have :
- Prior experience in OBIEE to OAC migrations.
- Exposure to data security models and cloud performance tuning.
- Certification in Oracle Cloud-related technologies.
Job Title: Full Stack Engineer
Location: Delhi-NCR
Type: Full-Time
Responsibilities
Frontend:
- Develop responsive, intuitive interfaces using HTML, CSS (SASS), React, and Vanilla JS.
- Implement real-time features using sockets for dynamic, interactive user experiences.
- Collaborate with designers to ensure consistent UI/UX patterns and deliver visually compelling products.
Backend:
- Design, implement, and maintain APIs using Python (FastAPI); a minimal endpoint sketch follows this list.
- Integrate AI-driven features to enhance user experience and streamline processes.
- Ensure the code adheres to best practices in performance, scalability, and security.
- Troubleshoot and resolve production issues, minimizing downtime and improving reliability.
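As an illustration of the FastAPI work described above, here is a minimal, self-contained endpoint sketch. The "widgets" resource, its fields, and the in-memory store are hypothetical placeholders, not part of the actual product.

```python
# Minimal FastAPI sketch: one typed resource with validation.
# The "widgets" resource and its fields are hypothetical.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class Widget(BaseModel):
    name: str
    quantity: int = 0

_store: dict[int, Widget] = {}  # stand-in for a real database

@app.post("/widgets/{widget_id}", status_code=201)
def create_widget(widget_id: int, widget: Widget) -> Widget:
    if widget_id in _store:
        raise HTTPException(status_code=409, detail="widget already exists")
    _store[widget_id] = widget
    return widget

@app.get("/widgets/{widget_id}")
def read_widget(widget_id: int) -> Widget:
    if widget_id not in _store:
        raise HTTPException(status_code=404, detail="widget not found")
    return _store[widget_id]

# Run locally with: uvicorn main:app --reload
```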
Database & Data Management:
- Work with PostgreSQL for relational data, ensuring optimal queries and indexing (a brief sketch follows this list).
- Utilize ClickHouse or MongoDB where appropriate to handle specific data workloads and analytics needs.
- Contribute to building dashboards and tools for analytics and reporting.
- Leverage AI/ML concepts to derive insights from data and improve system performance.
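A brief sketch of the query-plan inspection and indexing work mentioned above, using the psycopg2 driver. The DSN, table, and column names are hypothetical.

```python
# Sketch: inspecting a slow PostgreSQL query plan, then adding an index.
# DSN, table, and column names are hypothetical.
import psycopg2

def explain_and_index(dsn: str) -> None:
    with psycopg2.connect(dsn) as conn:
        with conn.cursor() as cur:
            # Look at the planner's strategy before changing anything.
            cur.execute(
                "EXPLAIN ANALYZE SELECT * FROM events "
                "WHERE user_id = %s ORDER BY created_at DESC LIMIT 50",
                (42,),
            )
            for (line,) in cur.fetchall():
                print(line)
            # A composite index matching the WHERE + ORDER BY shape
            # lets the planner avoid a full scan and a separate sort.
            cur.execute(
                "CREATE INDEX IF NOT EXISTS idx_events_user_created "
                "ON events (user_id, created_at DESC)"
            )
```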
General:
- Use Git for version control; conduct code reviews, ensure clean commit history, and maintain robust documentation.
- Collaborate with cross-functional teams to deliver features that align with business goals.
- Stay updated with industry trends, particularly in AI and emerging frameworks, and apply them to enhance our platform.
- Mentor junior engineers and contribute to continuous improvement in team processes and code quality.
Role Overview:
We are seeking a Senior Software Engineer (SSE) with strong expertise in Kafka, Python, and Azure Databricks to lead and contribute to our healthcare data engineering initiatives. This role is pivotal in building scalable, real-time data pipelines and processing large-scale healthcare datasets in a secure and compliant cloud environment.
The ideal candidate will have a solid background in real-time streaming, big data processing, and cloud platforms, along with strong leadership and stakeholder engagement capabilities.
Key Responsibilities:
- Design and develop scalable real-time data streaming solutions using Apache Kafka and Python (a minimal sketch follows this list).
- Architect and implement ETL/ELT pipelines using Azure Databricks for both structured and unstructured healthcare data.
- Optimize and maintain Kafka applications, Python scripts, and Databricks workflows to ensure performance and reliability.
- Ensure data integrity, security, and compliance with healthcare standards such as HIPAA and HITRUST.
- Collaborate with data scientists, analysts, and business stakeholders to gather requirements and translate them into robust data solutions.
- Mentor junior engineers, perform code reviews, and promote engineering best practices.
- Stay current with evolving technologies in cloud, big data, and healthcare data standards.
- Contribute to the development of CI/CD pipelines and containerized environments (Docker, Kubernetes).
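For illustration only, a minimal consume-transform-produce loop using the confluent-kafka Python client. The broker address, topic names, and the enrichment step are hypothetical placeholders, not the team's actual pipeline.

```python
# Minimal consume-transform-produce sketch with confluent-kafka.
# Broker address, topics, and the transform are hypothetical.
import json

from confluent_kafka import Consumer, Producer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "record-normalizer",
    "auto.offset.reset": "earliest",
})
producer = Producer({"bootstrap.servers": "localhost:9092"})
consumer.subscribe(["raw-events"])

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"consumer error: {msg.error()}")
            continue
        record = json.loads(msg.value())
        record["normalized"] = True  # stand-in for real enrichment logic
        producer.produce("clean-events", json.dumps(record).encode("utf-8"))
        producer.poll(0)  # serve delivery callbacks
finally:
    consumer.close()
    producer.flush()
```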
Required Skills & Qualifications:
- 4+ years of hands-on experience in data engineering roles.
- Strong proficiency in Kafka (including Kafka Streams, Kafka Connect, Schema Registry).
- Proficient in Python for data processing and automation.
- Experience with Azure Databricks (or readiness to ramp up quickly).
- Solid understanding of cloud platforms, with a preference for Azure (AWS/GCP is a plus).
- Strong knowledge of SQL and NoSQL databases; data modeling for large-scale systems.
- Familiarity with containerization tools like Docker and orchestration using Kubernetes.
- Exposure to CI/CD pipelines for data applications.
- Prior experience with healthcare datasets (EHR, HL7, FHIR, claims data) is highly desirable.
- Excellent problem-solving abilities and a proactive mindset.
- Strong communication and interpersonal skills to work in cross-functional teams.
1. Pre-Flight Inspection:
● Conduct thorough pre-flight checks on the drone, including battery levels, sensors, propellers, and GPS calibration.
● Ensure all firmware and software are up to date for both the drone and remote controller.
● Verify that the drone is in optimal operating condition, including payload attachments, if applicable.
2. Flight Planning:
● Plan the flight route based on mission objectives, considering factors like altitude, weather conditions, and airspace regulations.
3. Safe and Efficient Flight Operations:
● Monitor drone telemetry data during flight (e.g., battery level, GPS signal, altitude, and speed).
● Ensure real-time communication with ground personnel and provide situational updates as necessary.
4. Emergency Procedures:
● Be prepared to handle emergencies such as low battery, lost GPS signal, or malfunctioning sensors.
● Execute safe and controlled emergency landings if required.
5. Post-Flight Data Handling:
● Safely retrieve the drone and perform a post-flight inspection, checking for any damage or malfunctions.
● Offload data collected during the flight, ensuring secure storage and proper labeling.
● Provide a detailed flight report, documenting any issues, flight conditions, and mission outcomes.
Minimum Requirements:
● Must be 18 years of age or older.
● Must have at least 50 hours of flying experience.
● Should have knowledge of basic photography to handle camera settings in manual mode.
● Should have smooth, steady hand control for camera movements.
● Must remain calm and in control in panic situations.
● Should be familiar with drone hardware and software, including flight controls, GPS systems, and camera setups.
● Able to make rapid decisions as situations demand.
Job Description:
We are looking for a candidate with a strong background in the design and implementation of scalable architecture and a good understanding of algorithms, data structures, and design patterns. The candidate must be ready to learn new tools, languages, and technologies.
Skills :
Microservices, MySQL/Postgres, Kafka/Message Queues, Elasticsearch, Data pipelines, AWS Cloud, Clickhouse/Redshift
What you need to succeed in this role
- Minimum 6 years of experience
- Good understanding of various database types: RDBMS, NoSQL, GraphDB, etc.
- Ability to build highly stable, reliable APIs and backend services.
- Should be familiar with distributed, high-availability database systems
- Experience with queuing systems like Kafka
- Hands-on in cloud infrastructure AWS/GCP/Azure
- A strong plus if you know one or more of the following: Confluent KSQL, Kafka Connect, Kafka Streams
- Hands-on experience with data warehouse/OLAP systems such as Redshift or ClickHouse is an added plus.
- Good communication and interpersonal skills
Benefits of joining us
- Ability to join a small and growing team, and work with some of the coolest people you've ever met
- Opportunity to make an impact, and leave your mark on this organization.
- Competitive compensation, with the ability to shape your own career trajectory
- Go Extra Mile with Learning and Development
What can you look for?
A wholesome opportunity in a fast-paced environment that will enable you to juggle multiple concepts while maintaining quality of content, share your ideas, and learn a great deal at work. Work with a team of highly talented young professionals and enjoy the benefits of being at Xoxoday.
We are
A fast-growing SaaS commerce company based in Bangalore with offices in Delhi, Mumbai, SF, Dubai, Singapore, and Dublin. We have three products in our portfolio: Plum, Empuls, and Compass. We work with over 1,000 global clients, helping them engage and motivate their employees, sales teams, channel partners, and consumers for better business results.
Responsibilities:
- Develop creative solutions and write technical designs based on the requirements.
- Work closely with peer teams to ensure that applications are written to allow for overall system performance.
- Assist in tuning and optimization.
- Develop and execute unit tests for product components.
- Perform peer code reviews and provide feedback.
- Promote high quality, scalability, and timely completion of projects.
- Apply agile approach to coordinate development and determine project scope and limitations.
Requirements:
- Bachelor's degree in Engineering, CS, or equivalent experience
- Development in Core Java, J2EE, Struts, Spring, client-side scripting, Hibernate, and databases
- Development of scalable core Java applications, web applications, and web services
- OOP concepts
- Data Structures, algorithms and their applications
- Strong problem-solving skills
- Experience in building microservices
- Strong experience with the Spring Boot stack (Spring Cloud, Spring Data)
- Extensive experience in developing and consuming REST APIs
- Experience in Kafka distributed messaging
- Hands-on experience in Redis, Apache Ignite, Hazelcast
- Strong experience in RDBMS and NoSQL databases such as MongoDB
- Experience in using Elastic Search
- Experience in profiling applications
- Strong analytical skills and general logical reasoning
- Excellent written and verbal communication skills.
- Good understanding of Software development life cycle (SDLC)
- Basic SQL queries
