
Roles & Responsibilities:
- Define and lead strategic performance projects and manage stakeholders to drive outcomes.
- Analyze and present key MIS reports for management review.
- Lead business planning, including the formulation of annual, monthly, and daily plans for key performance metrics.
- Drive the company's strategic initiatives in line with the CEO's agenda.
- Study and improve the company's operational processes.
- Drive business performance insights, working closely with the sales, marketing, growth, and analytics teams.
- Create strategic, value-added analyses of growth opportunities, retention levers, and core business metrics to drive better decision-making and results.
- Conduct ongoing analysis of key business drivers, trends and performance.

Dremio / Lakehouse Data Architect
Review Criteria
- Strong Dremio / Lakehouse Data Architect profile
- 5+ years of experience in Data Architecture / Data Engineering, with minimum 3+ years hands-on in Dremio
- Strong expertise in SQL optimization, data modeling, query performance tuning, and designing analytical schemas for large-scale systems
- Deep experience with cloud object storage (S3 / ADLS / GCS) and file formats such as Parquet, Delta, Iceberg along with distributed query planning concepts
- Hands-on experience integrating data via APIs, JDBC, Delta/Parquet, object storage, and coordinating with data engineering pipelines (Airflow, DBT, Kafka, Spark, etc.)
- Proven experience designing and implementing lakehouse architecture including ingestion, curation, semantic modeling, reflections/caching optimization, and enabling governed analytics
- Strong understanding of data governance, lineage, RBAC-based access control, and enterprise security best practices
- Excellent communication skills with ability to work closely with BI, data science, and engineering teams; strong documentation discipline
- Candidates must come from enterprise data modernization, cloud-native, or analytics-driven companies
Preferred
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) or data catalogs (Collibra, Alation, Purview); familiarity with Snowflake, Databricks, or BigQuery environments
Job Specific Criteria
- CV Attachment is mandatory
- How many years of experience do you have with Dremio?
- Which is your preferred job location (Mumbai / Bengaluru / Hyderabad / Gurgaon)?
- Are you okay with 3 days WFO (work from office)?
- The virtual interview requires video to be on; are you okay with that?
Role & Responsibilities
You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.
- Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS); a minimal client-side query sketch follows this list.
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
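To make the integration points above concrete, here is a minimal, hypothetical sketch of querying a Dremio semantic-layer view from Python over Arrow Flight. The endpoint, credentials, and view name are placeholders rather than anything from this posting, and exact authentication details vary by Dremio version and deployment.

```python
# Hypothetical sketch: querying a curated Dremio view over Arrow Flight.
# Endpoint, credentials, and the view name are placeholders.
import pyarrow.flight as flight

client = flight.FlightClient("grpc+tcp://dremio.example.internal:32010")

# Basic-auth handshake; returns a bearer-token header to attach to later calls.
token = client.authenticate_basic_token("analyst_user", "analyst_password")
options = flight.FlightCallOptions(headers=[token])

sql = "SELECT region, SUM(revenue) AS revenue FROM curated.sales_daily GROUP BY region"
info = client.get_flight_info(flight.FlightDescriptor.for_command(sql), options)
table = client.do_get(info.endpoints[0].ticket, options).read_all()

print(table.num_rows, "rows:", table.schema)
```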
Ideal Candidate
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
- Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.); a minimal orchestration sketch follows this list.
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
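As a small, hypothetical illustration of the pipeline orchestration mentioned above, the sketch below defines a minimal Airflow DAG with one daily task. The DAG name, schedule, and task body are placeholders, and the import paths and `schedule` argument assume Airflow 2.4+.

```python
# Hypothetical sketch: a minimal Airflow DAG (Airflow 2.4+ style) that would run a
# daily curation/validation step against the lakehouse. The task body is a placeholder.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def refresh_curated_view(**_):
    # Placeholder: a real task might rebuild or validate a curated Dremio view,
    # e.g. via the Arrow Flight client shown in the earlier sketch.
    print("refreshing curated.sales_daily ...")


with DAG(
    dag_id="lakehouse_curation_daily",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="refresh_curated_view",
        python_callable=refresh_curated_view,
    )
```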
Informatica ETL Developer
Job Description
Experience in Informatica and SQL, covering data migration, data loading, and data management.
Hands-on experience with SQL queries.
Good knowledge of and working experience with the Informatica ETL tool, from developing the ETL to scheduling and monitoring it.
Experience writing and debugging complex SQL queries and stored procedures.
Ability to build, test, and maintain ETL workflows.
Knowledge of and experience with the Git version control system.
Good to have: knowledge of PostgreSQL and Amazon S3.
Create, modify, document, and enhance data specifications, including identifying and mapping data elements from source systems into the dimensional data warehouse and operational system; validate that the operational system and data warehouse meet specified requirements.
Analyze and QA data from different sources and ensure it is properly loaded into our systems (a minimal post-load check is sketched just after this list).
Investigate data quality issues and implement appropriate solutions.
Manage tasks and coordinate with other team members and the supervisor to meet timelines; actively contribute to project status reporting processes to identify and mitigate risks.
Requirement gathering: frequent interaction with data and technical managers, database administrators, and application users across different departments and divisions to ensure smooth implementation and running of the application.
Knowledge of health system functions, terminology, and standard coding systems preferred.
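As a hypothetical illustration of the post-load QA described above, the sketch below reconciles row counts between a staging table and its warehouse target on PostgreSQL (mentioned above as nice-to-have). The connection string and table names are placeholders.

```python
# Hypothetical sketch: basic post-load reconciliation between a staging table and
# the warehouse table it feeds. Connection details and table names are placeholders.
import psycopg2


def row_count(cur, table: str) -> int:
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    return cur.fetchone()[0]


with psycopg2.connect("dbname=dw user=etl_qa host=localhost") as conn:
    with conn.cursor() as cur:
        source = row_count(cur, "staging.orders_extract")
        target = row_count(cur, "dw.fact_orders")
        if source != target:
            raise ValueError(f"Load mismatch: staging={source}, warehouse={target}")
        print(f"Row counts match: {source}")
```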
Qualifications
A Bachelor's degree in Computer Science Engineering, Mathematics, Statistics, Science, Information Technology, or Business Administration is required; a Master's degree in Computer Science, Computer Applications, or Business Administration is preferred.
Minimum 3 years of extensive experience in Informatica ETL design and development, data migration, data loading and management, SQL queries, and T-SQL programming and testing.
Additional information
Good written and verbal communication skills
Good client interaction experience
Good team player
Total experience: 5 years, of which a minimum of 3 years should be relevant.
- You're proficient in React.js and have a strong frontend JavaScript foundation
- Knowledge of Web3.js integration is a definite plus but not a requirement
- You have a passion for writing code as well as understanding and crafting the ways systems interact
- You have experience deploying to and implementing solutions in AWS
- You believe in the benefits of agile processes and shipping code often
- You are pragmatic and work to coalesce requirements into reasonable solutions that provide value
Responsibilities
- Deploy well-tested, maintainable and scalable software solutions
- Take end-to-end ownership of the technology stack and product
- Collaborate with other engineers to architect scalable technical solutions
- Embrace and improve our standards and processes to reduce friction and unlock efficiency
Current Ecosystem:
ShibaSwap: https://shibaswap.com/#/
Metaverse: https://shib.io/#/
NFTs: https://opensea.io/collection/theshiboshis
Game: Shiba Eternity on iOS and Android
Android Developer
JD:
We are looking for an Android Developer who possesses a passion for pushing mobile technologies to the limits. This Android app developer will work with our team of talented engineers to design and build the next generation of our mobile applications, working closely with other app development and technical teams.
Responsibilities:
- Design and build advanced applications for the Android platform
- Collaborate with cross-functional teams to define, design, and ship new features
- Work with outside data sources and APIs
- Unit-test code for robustness, including edge cases, usability, and general reliability
- Work on bug fixing and improving application performance
- Continuously discover, evaluate, and implement new technologies to maximize development efficiency
Requirements:
- Proven software development experience and Android skills development
- Proven working experience in Android app development
- Have published at least one original Android app
- Experience with Android SDK
- Experience working with remote data via REST and JSON
- Experience with third-party libraries and APIs
- Working knowledge of the general mobile landscape, architectures, trends, and emerging technologies
- Strong technical knowledge in software, hardware, and networking
- Prior experience in customer service or a call center environment
- IP protocol and network experience highly desirable
- Working knowledge of NetApp technical systems (6 months or more)
- Strong understanding of computing technology including hardware components, data storage, operating systems, software applications, common peripheral devices, and external connectivity
- Sound problem-solving skills with linear and logical troubleshooting
- Process knowledge, assessment, design and documentation skills
- Strong oral and written communication skills
- Solid analytical, technical, and project management skills
- Must have proficiency with various software applications including Microsoft Office (Word, Excel, PowerPoint, Outlook)
- Ability to work independently with minimal supervision
- Must be available weekends
- Languages: B2 English
- Develop test plans alongside product designers and product engineers.
- Drive improvements to test framework architecture and test coverage.
- Drive adoption of best practices in code health, testing, testability, and maintainability (clean code, test pyramid).
- Track test gaps, quality, and productivity metrics, and work with other engineering teams to close the gaps this data reveals.
- Recommend improvements to overall best practices, design, testability, quality, and productivity.
Requirements
- 8+ years of experience working in the Quality Engineering field
- 3+ years of experience with UI & API test automation tools (Selenium, etc.); a minimal example sketch follows this list
- 4+ years of experience in leading a team
- Proficient in TypeScript, Go, or a related programming language
- Experience with Continuous Integration systems (e.g., GitLab)
- Experience with AWS, Docker, and Kubernetes
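As a hypothetical illustration of the UI automation mentioned in the list above, here is a minimal Selenium WebDriver check using the Python bindings and pytest (the posting itself prefers TypeScript or Go). The URL and expected title are placeholders.

```python
# Hypothetical sketch: a minimal headless-Chrome UI check with Selenium and pytest.
# The URL and expected title are placeholders.
import pytest
from selenium import webdriver


@pytest.fixture
def driver():
    options = webdriver.ChromeOptions()
    options.add_argument("--headless=new")  # run without opening a browser window
    drv = webdriver.Chrome(options=options)
    yield drv
    drv.quit()


def test_homepage_title(driver):
    driver.get("https://example.com/")
    assert "Example Domain" in driver.title
```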
About You
- A+ character. We are team-first here at the company.
- A hard-working mentality. It’s early and there is still a lot to build.
- An excellent communicator.
- A fun attitude. Life’s too short. We can have fun while we work hard on cool things.
- Smarts. We need people that are smart enough to make decisions on their own and also smart enough to know when they need input from others.
Perks & Benefits
- Competitive salary and benefits
- Group Medical insurance
- Life and long-term disability insurance
- Collaborative workspace
- The opportunity to join the fastest growing start-up alongside a team of motivated and driven individuals
What you’ll do:
- Translation of text from English to Odia
- Contribute to the word pool of the Odia language database.
- Content creation.
- Research on Odia-language content, web pages, and applications.
- Testing and enhancement of proprietary language technologies
- Development of internal tools
- Maintaining records of work received, work performed, etc.
Who you are:
- Graduation/post-graduation in languages, Journalism, or Mass Communication
- Diploma in Translation will be an added advantage
- Excellent command of English and Odia
- Working knowledge of text entry in the Odia language
- Working experience with Unicode text.
- Good knowledge of computer usage including MS Word and Excel
- Experience in translation/copy-writing or creative writing
