
Job Title: Junior Business Analyst with Project Coordination Experience
Key Responsibilities:
- Analyze business processes, gather and document requirements, and provide solutions to improve efficiency.
- Assist in project planning, scheduling, and resource management.
- Coordinate project activities, timelines, and communication between teams.
- Support project managers in monitoring project progress, identifying risks, and ensuring project goals are met.
- Create and maintain project documentation, including business requirements, project plans, and status reports.
- Collaborate with stakeholders to ensure alignment between business goals and project deliverables.
- Assist with testing and validation of solutions to ensure they meet business needs.

About Nebula Technologies
Review Criteria:
- Strong Dremio / Lakehouse Data Architect profile
- 5+ years of experience in Data Architecture / Data Engineering, including at least 3 years of hands-on Dremio experience
- Strong expertise in SQL optimization, data modeling, query performance tuning, and designing analytical schemas for large-scale systems
- Deep experience with cloud object storage (S3 / ADLS / GCS) and file formats such as Parquet, Delta, and Iceberg, along with distributed query planning concepts
- Hands-on experience integrating data via APIs, JDBC, Delta/Parquet, and object storage, and coordinating with data engineering pipelines (Airflow, DBT, Kafka, Spark, etc.)
- Proven experience designing and implementing lakehouse architecture including ingestion, curation, semantic modeling, reflections/caching optimization, and enabling governed analytics
- Strong understanding of data governance, lineage, RBAC-based access control, and enterprise security best practices
- Excellent communication skills with ability to work closely with BI, data science, and engineering teams; strong documentation discipline
- Candidates must come from enterprise data modernization, cloud-native, or analytics-driven companies
Preferred:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) or data catalogs (Collibra, Alation, Purview); familiarity with Snowflake, Databricks, or BigQuery environments
Role & Responsibilities:
You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.
- Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
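The query-optimization work described above follows the same EXPLAIN-driven loop in any engine: inspect the plan, add the right physical structure, and confirm the plan changed. A minimal, engine-agnostic sketch of that loop, using SQLite as a stand-in for Dremio (table and index names are invented; in Dremio the equivalent structure would be a reflection or sort key rather than an index):

```python
import sqlite3

# In-memory database with a small fact table standing in for a lakehouse dataset.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("emea", 10.0), ("apac", 20.0), ("emea", 5.0)] * 1000,
)

query = "SELECT region, SUM(amount) FROM sales WHERE region = 'emea' GROUP BY region"

# Before tuning: the planner has to scan the whole table.
before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# Add an index on the filter/group column; a reflection or sort key would
# accelerate the same access pattern in a lakehouse engine.
conn.execute("CREATE INDEX idx_sales_region ON sales (region)")
after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print("before:", before[0][-1])  # a full table scan
print("after: ", after[0][-1])   # an index-assisted search
```

The point is the workflow, not SQLite itself: plan inspection before and after each change is what turns tuning from guesswork into measurement.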
Ideal Candidate:
- Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
- 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
- Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
Preferred:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
- Exposure to Snowflake, Databricks, or BigQuery environments.
- Experience in high-tech, manufacturing, or enterprise data modernization programs.
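RBAC-based access control appears in both the review criteria and the responsibilities above. At its core it is a mapping from roles to permissions, checked at access time; a minimal sketch, with invented role and permission names:

```python
# Role-to-permission mapping; names here are illustrative, not taken from
# any particular platform's governance model.
ROLE_PERMISSIONS = {
    "analyst":  {"read:curated"},
    "engineer": {"read:raw", "read:curated", "write:curated"},
    "admin":    {"read:raw", "read:curated", "write:curated", "grant"},
}

def is_allowed(roles, permission):
    """Return True if any of the user's roles grants the permission."""
    return any(permission in ROLE_PERMISSIONS.get(role, set()) for role in roles)

print(is_allowed(["analyst"], "read:curated"))   # True
print(is_allowed(["analyst"], "write:curated"))  # False
```

In an enterprise deployment the same check is delegated to the platform's grant system and tied into lineage and audit logging, but the role-indirection idea is unchanged.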
We are looking for a trainer to deliver advanced AI training for Oracle services, specifically leveraging the licensed tool Claude.
The scope of the training should cover the following topics:
- EBS Form Designing using AI
- APEX Form Designing using AI
- Report Generation (.rdf, Fusion BIP) using AI
- Report Bursting using AI
- PaaS/VBCS Form Personalization using AI tools
- Fusion Form Customization using Groovy Script and AI
- EBS to Fusion Form Conversion
- APEX to Fusion Form Conversion
The expected number of participants is 15, and the training is preferred to be conducted offline.
Minimum requirements
5+ years of industry software engineering experience (internships and co-ops do not count)
Strong coding skills in any programming language (we understand new languages can be learned on the job so our interview process is language agnostic)
Strong collaboration skills, can work across workstreams within your team and contribute to your peers’ success
Ability to thrive with a high level of autonomy and responsibility, with an entrepreneurial mindset
Interest in working as a generalist across varying technologies and stacks to solve problems and delight both internal and external users
Preferred Qualifications
Experience with large-scale financial tracking systems
Good understanding and practical knowledge of cloud-based services and related technologies (e.g. gRPC, GraphQL, Docker/Kubernetes, and cloud providers such as AWS)
**Job Title**: UX Designer
**Location**: Remote
**Job Type**: Freelancing
**Introduction**:
We are seeking a creative and detail-oriented UX Designer to join our team. The ideal candidate will have a strong portfolio of successful UX and other technical projects and will be responsible for designing the overall functionality of the product and ensuring a great user experience.
**Responsibilities**:
– Translate concepts into user flows, wireframes, mockups, and prototypes that lead to intuitive user experiences.
– Facilitate the client’s product vision by researching, conceiving, sketching, prototyping, and user-testing experiences for digital products.
– Design and deliver wireframes, user stories, user journeys, and mockups optimized for a wide range of devices and interfaces.
– Identify design problems and devise elegant solutions.
– Make strategic design and user-experience decisions related to core, and new, functions and features.
– Take a user-centered design approach and rapidly test and iterate your designs.
– Collaborate with other team members and stakeholders.
– Ask smart questions, take risks, and champion new ideas.
**Qualifications**:
– Three or more years of UX design experience. Preference will be given to candidates who have experience designing complex solutions for complete digital environments.
– Expertise in standard UX software such as Sketch, OmniGraffle, Axure, InVision, UXPin, Balsamiq, Framer, and the like is a must. Basic HTML5, CSS3, and JavaScript skills are a plus.
– Ability to work with clients to understand detailed requirements and design complete user experiences that meet client needs and vision.
– Extensive experience in using UX design best practices to design solutions, and a deep understanding of mobile-first and responsive design.
– A solid grasp of user-centered design (UCD), planning and conducting user research, user testing, A/B testing, rapid prototyping, heuristic analysis, usability and accessibility concerns.
– Ability to iterate designs and solutions efficiently and intelligently.
– Ability to clearly and effectively communicate design processes, ideas, and solutions to teams and clients.
– A clear understanding of the importance of user-centered design and design thinking.
– Ability to work effectively in a team setting including synthesizing abstract ideas into concrete design implications.
**Benefits**:
– Competitive salary and comprehensive benefits package.
– Opportunities for professional development and career growth.
– Flexible working hours, including provisions for remote work.
– A creative and inclusive work environment.
Samsan Technologies is hiring automotive professionals for its Bangalore location.
We are looking for professional Linux/bootloader developers to carry out end-to-end Linux and bootloader development.
Experience: 4-8 Years
Develop Bootloaders for RISC-V boards
Enable Secure Boot for RISC-V system
Development of Firmware update functionality
Development of Linux BSP & Drivers
Preferred requirements:
5+ years of experience in board bring-up and bootloader development (UEFI/EDK2/Secure Boot)
Experience in RISC-V ISA is highly desirable.
Proficiency in C/C++
• Strong leadership skills; 3 years' experience leading development teams.
• Experience architecting technical designs based on functional and business requirements.
• Project Management skills
• Exceptional communication skills, verbal and written
• Should be able to:
  - Produce high-quality technical documentation for our customers
  - Decompose technical tasks and provide accurate estimates
  - Coach less experienced members of your team
  - Undertake development tasks without supervision, including software design, programming, and unit testing
  - Monitor progress and provide updates to the Project Manager
Technical Skills: Node.js, RESTful APIs, Express, React.js, Redux, MongoDB
- At least 2 years of working experience as a MEAN/MERN stack developer.
- In-depth knowledge of React.js, Angular.js, Node.js, and Express.js.
- Experience implementing applications using React.js and Angular.js.
- Experience creating front end applications using HTML, React and Angular.
- Hands on experience with JavaScript Development on both client and server-side
Preferred Qualifications & Desired Competencies:
- Can-do, go-getter attitude.
- Ability to learn new technologies quickly.
- Self-motivated, results-driven individual and passionate about work.
- Excellent communication and problem-solving skills.
Pipelines should be optimised to handle real-time, batch-update, and historical data.
Establish scalable, efficient, automated processes for complex, large scale data analysis.
Write high quality code to gather and manage large data sets (both real time and batch data) from multiple sources, perform ETL and store it in a data warehouse.
Manipulate and analyse complex, high-volume, high-dimensional data from varying sources using a variety of tools and data analysis techniques.
Participate in data pipelines health monitoring and performance optimisations as well as quality documentation.
Interact with end users/clients and translate business language into technical requirements.
Act independently to expose and resolve problems.
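The gather-transform-load flow described above can be sketched minimally. SQLite stands in for the warehouse, and the batch and real-time sources are simulated with in-memory rows (all table, column, and event names are illustrative):

```python
import sqlite3

# Simulated sources: a batch extract and a real-time feed. In a real
# pipeline these would come from files, queues, or APIs.
batch_rows = [("2024-01-01", "shipment", " 120 "), ("2024-01-02", "shipment", "80")]
realtime_rows = [("2024-01-02", "gps_ping", "1")]

def transform(row):
    """Clean and type-cast one raw record."""
    date, event, value = row
    return date, event, int(value.strip())

# Load both sources into one warehouse table (SQLite as a stand-in).
warehouse = sqlite3.connect(":memory:")
warehouse.execute(
    "CREATE TABLE events (event_date TEXT, event_type TEXT, value INTEGER)"
)
for source in (batch_rows, realtime_rows):
    warehouse.executemany(
        "INSERT INTO events VALUES (?, ?, ?)", (transform(r) for r in source)
    )

total = warehouse.execute("SELECT SUM(value) FROM events").fetchone()[0]
print(total)  # 201
```

A production pipeline adds scheduling, retries, and monitoring around this core, but the extract-transform-load shape stays the same for batch and streaming inputs alike.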
Job Requirements:
2+ years' experience working in software development and data pipeline development for enterprise analytics.
2+ years of working with Python with exposure to various warehousing tools
In-depth working experience with commercial tools such as AWS Glue, Talend, Informatica, DataStage, etc.
Experience with various relational databases like MySQL, MS SQL Server, Oracle, etc. is a must.
Experience with analytics and reporting tools (Tableau, Power BI, SSRS, SSAS).
Experience in various DevOps practices helping the client to deploy and scale the systems as per requirement.
Strong verbal and written communication skills with other developers and business clients.
Knowledge of Logistics and/or Transportation Domain is a plus.
Hands-on with traditional databases and ERP systems like Sybase and PeopleSoft.
