
Review Criteria:
- Strong Dremio / Lakehouse Data Architect profile
- 5+ years of experience in Data Architecture / Data Engineering, with a minimum of 3 years hands-on in Dremio
- Strong expertise in SQL optimization, data modeling, query performance tuning, and designing analytical schemas for large-scale systems
- Deep experience with cloud object storage (S3 / ADLS / GCS) and file formats such as Parquet, Delta, and Iceberg, along with distributed query planning concepts
- Hands-on experience integrating data via APIs, JDBC, Delta/Parquet, object storage, and coordinating with data engineering pipelines (Airflow, DBT, Kafka, Spark, etc.)
- Proven experience designing and implementing lakehouse architecture including ingestion, curation, semantic modeling, reflections/caching optimization, and enabling governed analytics
- Strong understanding of data governance, lineage, RBAC-based access control, and enterprise security best practices
- Excellent communication skills with ability to work closely with BI, data science, and engineering teams; strong documentation discipline
- Candidates must come from enterprise data modernization, cloud-native, or analytics-driven companies
Preferred:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) or data catalogs (Collibra, Alation, Purview); familiarity with Snowflake, Databricks, or BigQuery environments
Role & Responsibilities:
You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.
- Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
Ideal Candidate:
- Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
- 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
- Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
Preferred:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
- Exposure to Snowflake, Databricks, or BigQuery environments.
- Experience in high-tech, manufacturing, or enterprise data modernization programs.
About Us
21 Knots is a design, engineering, and consulting firm providing services to the global maritime and oil & gas industry. In an industry constantly evolving due to dynamic regulations, economic fluctuations, and climate change mandates, we strive to deliver cutting-edge solutions with an unwavering commitment to excellence. Our comprehensive services are designed to create value for our esteemed clients while enabling them to achieve their business goals.
The Role
We’re looking for a skilled and motivated Software Developer to join our team at 21 Knots: someone strong in backend development with a working knowledge of frontend technologies. The ideal candidate will have a strong command of backend frameworks such as Django, FastAPI, or Flask, and familiarity with frontend tools like HTML, CSS, JavaScript, and optionally React, Angular, or Vue. You will play a key role in building, maintaining, and optimizing scalable software solutions while collaborating with a cross-functional team.
Responsibilities
- Develop, test, and maintain backend services using Django, FastAPI, or Flask.
- Design and implement RESTful APIs for web and internal tools.
- Work with relational and non-relational databases such as PostgreSQL, MySQL, or MongoDB.
- Optimize application performance and implement backend security best practices.
- Collaborate with frontend developers, designers, and cross-functional teams to deliver high-quality solutions.
- Write clean, maintainable, and well-documented code.
- Support basic frontend development tasks using HTML, CSS, and JavaScript.
- Work with modern frontend frameworks like React.js, Angular, or Vue (good to have).
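As a rough, framework-agnostic sketch of the kind of REST endpoint work this role involves (the resource name, fields, and in-memory store below are hypothetical, not part of this posting), the core of such a handler looks like:

```python
import json

# Hypothetical in-memory store standing in for PostgreSQL/MySQL/MongoDB.
PRODUCTS = {1: {"id": 1, "name": "widget", "price": 9.99}}

def get_product(product_id: int) -> tuple[int, str]:
    """Handle GET /products/<id>: return (HTTP status, JSON body)."""
    product = PRODUCTS.get(product_id)
    if product is None:
        return 404, json.dumps({"error": "not found"})
    return 200, json.dumps(product)

status, body = get_product(1)
print(status, body)  # 200 {"id": 1, "name": "widget", "price": 9.99}
```

In Django, FastAPI, or Flask, the framework supplies the routing, validation, and serialization layers around a function of this shape.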
What You’ll Need
Experience:
- 1–4 years of hands-on experience in backend development.
- Exposure to full-stack development environments.
- Experience working with RESTful APIs, databases, and cloud integration is a plus.
Education:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
Tools & Software Proficiency:
- Proficiency in Django, FastAPI, or Flask.
- Familiarity with PostgreSQL, MySQL, or MongoDB.
- Basic knowledge of frontend technologies: HTML, CSS, JavaScript.
- Exposure to Bootstrap or Tailwind CSS for responsive UI design.
- Good to have: experience with React, Angular, or Vue.
- Familiarity with cloud platforms (AWS, Azure, or GCP) and Agile/Scrum methodologies is an added advantage.
Skills & Competencies:
- Strong problem-solving and debugging skills.
- Understanding of RESTful architecture and secure coding practices.
- Ability to collaborate in a cross-functional environment.
- Good verbal and written communication skills.
Responsibilities:
1. Understanding UI testing
2. Interacting with clients
3. Creating test documents
4. Performing manual testing
Requirements:
1. Basic HTML, CSS, good aptitude, and analytical & problem-solving skills
2. Candidates with a software testing course certificate or experience in testing will be preferred
3. Knowledge of Excel, the SDLC, and Scrum/Agile methodology
The ideal candidate is a self-motivated multi-tasker and a demonstrated team player. You will be a lead developer responsible for developing new software products and enhancing existing ones. You should excel at working with large-scale applications and frameworks and have outstanding communication and leadership skills.
RESPONSIBILITIES:
- Writing clean, high-quality, high-performance, maintainable code.
- Develop and support software including applications, database integration, interfaces, and new functionality enhancements.
- Coordinate cross-functionally to ensure the project meets business objectives and compliance standards.
- Support test and deployment of new products and features.
- Participate in code reviews.
KEY SKILLS AND QUALIFICATIONS REQUIRED:
- A minimum of 3 years of relevant work experience in the development of scalable web applications.
- Proven experience in full-stack web development using TypeScript, Angular (1/2/4+), HTML5 and CSS3, and microservice API development; experience in PHP/Python/Laravel and MVC frameworks is a must.
- Exposure to CI/CD tools, Code Analysis, and Test automation is preferred.
- Expertise in Object-Oriented Design, Database Design, and XML Schema.
- Experience with Agile or Scrum software development methodologies.
- Ability to multi-task, organize, and prioritize work.
- Knowledge of advanced PHP concepts, MySQL, JSON, XML & ability to work in a LAMP/WAMP environment.
- Strong knowledge of database and code security, along with basic server operations such as cURL, CRUD, and cPanel.
- You’ve been building scalable backend solutions for web applications.
- You have experience with any of these backend programming languages: Python, NodeJS, or Java.
- You write understandable, very high-quality, testable code with an eye towards maintainability.
- You are a strong communicator. Explaining complex technical concepts to designers, support, and other engineers is no problem for you.
- You possess strong computer science fundamentals: data structures, algorithms, programming languages, distributed systems, and information retrieval.
- You have completed a bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent training, fellowship, or work experience.
- Building and operationalizing large-scale enterprise data solutions and applications using one or more Azure data and analytics services in combination with custom solutions: Azure Synapse/Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsight, Databricks, Cosmos DB, Event Hub/IoT Hub.
- Experience in migrating on-premises data warehouses to data platforms on the Azure cloud.
- Designing and implementing data engineering, ingestion, and transformation functions
- Experience with Azure Analysis Services
- Experience in Power BI
- Experience with third-party solutions such as Attunity/StreamSets and Informatica
- Experience with pre-sales activities (responding to RFPs, executing quick POCs)
- Capacity Planning and Performance Tuning on Azure Stack and Spark.
Required Skill Set: Microservices, Spring Boot framework, Core and Advanced Java, lambdas.
Job Description:
1) 3-8 years of Java development experience.
2) 4+ years of microservice development experience with the Spring Boot framework.
3) Core & Advanced Java (threading, design patterns, data structures), J2EE, REST web services.
4) Excellent knowledge of Enterprise Design Patterns.
5) Full stack development with Angular 8 experience will be a plus.
6) Experience with test-driven software development.
7) Exposure to the telecom domain.
8) eTOM/SID, which make up the TM Forum framework.
9) Decent communication skills.
Location: Panchkula, Mohali, Gurugram, Bangalore, Chennai, Pune, Dehradun.
About InVideo:
InVideo is a next-gen video creation platform. We believe the future of video creation is in the browser: across devices, collaborative, and easy. Since its launch in April 2019, InVideo has acquired 200K+ users from 150+ countries. InVideo has raised over $3.5M and is backed by marquee funds and angels such as Sequoia Surge, Ryan Hoover (CEO, Product Hunt), Gokul Rajaram, Haresh Chawla, Anand C, Kunal Shah, Kunal Bahl, etc.
Responsibilities
- Working with JSON for Template QA and publishing.
- Having a good eye for detail to point out bugs missed by the designers.
- Maintaining CMS for category tagging, release dates, front-end trays, etc.
- Working with the engineering team to seek out bugs, and get them fixed.
- Translating CSVs and other data from Amplitude, CleverTap, SQL etc. into readable sheets for the Design Team.
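A minimal sketch of the JSON side of template QA described above (the required fields here are hypothetical, just to show the shape of such a pre-publish check):

```python
import json

# Hypothetical required fields for a publishable template.
REQUIRED_FIELDS = {"template_id", "category", "release_date"}

def validate_template(raw: str) -> list[str]:
    """Return a list of problems in a template JSON string (empty list = OK)."""
    try:
        data = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc.msg}"]
    if not isinstance(data, dict):
        return ["template must be a JSON object"]
    missing = REQUIRED_FIELDS - data.keys()
    return [f"missing field: {name}" for name in sorted(missing)]

print(validate_template('{"template_id": 7, "category": "intro"}'))
# ['missing field: release_date']
```

Checks like this catch malformed templates before they reach the CMS, rather than after designers or users hit them.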
Experience
- Fresher or 1-2 years of experience in QA testing
- Bachelor’s degree in IT, CS, B.Tech, or a similar field
Required Skills
- Knowledge of SQL
- Attention to detail
Good to have
- Understanding of JSON
- Knowledge of Microsoft Excel/Google Sheets
Hi,
Greetings from LMV Financial Services Pvt Ltd
We have an immediate requirement for a Female Telesales Officer @ Somajiguda.
JOB DESCRIPTION:
✅Should have good communication skills and strong interpersonal skills.
✅Responsible for making outbound calls and regular follow-up on leads.
✅Understand customer requirements and encourage them to upgrade their loans.
⏩Attractive incentives upon successful sales.
⏩Data will be provided by the company.
Required candidate profile:
Position: Telesales Officer
Eligible criteria: Any graduate
Gender: Female Preferred
Salary: 10000+ Incentives