11+ SPARQL Jobs in India
ROLES AND RESPONSIBILITIES:
You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.
- Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
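Reflections, mentioned in the responsibilities above, are Dremio's materialized pre-aggregations: the query planner transparently substitutes them for full raw scans. A minimal, tool-agnostic sketch of the underlying idea in plain Python (the data and helper names are illustrative, not Dremio APIs):

```python
from collections import defaultdict

# Hypothetical raw fact rows: (region, product, revenue).
raw_events = [
    ("EU", "widget", 120.0),
    ("EU", "gadget", 80.0),
    ("US", "widget", 200.0),
    ("US", "widget", 50.0),
]

def build_aggregate(rows):
    """Pre-aggregate revenue by region, akin to an aggregation reflection."""
    agg = defaultdict(float)
    for region, _product, revenue in rows:
        agg[region] += revenue
    return dict(agg)

def query_raw(rows, region):
    """Answer the same query by scanning every raw row (no reflection)."""
    return sum(rev for r, _p, rev in rows if r == region)

reflection = build_aggregate(raw_events)  # built once, reused by many queries
assert reflection["US"] == query_raw(raw_events, "US") == 250.0
```

The trade-off the architect tunes is exactly this one: reflection refresh cost and storage versus repeated raw-scan latency.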
IDEAL CANDIDATE:
- Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
- 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
- Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
PREFERRED:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
- Exposure to Snowflake, Databricks, or BigQuery environments.
- Experience in high-tech, manufacturing, or enterprise data modernization programs.
Requirements:
- Prior experience in B2B SaaS or cybersecurity.
- A minimum of 3 years of experience selling to the US market.
- A proven track record of closing mid-market and enterprise-level deals in a SaaS environment.
- Experience managing and achieving $300K–$500K annual revenue targets.
- Strong exposure to the full sales cycle, with an emphasis on outbound acquisition and pipeline ownership.
- A high-agency, self-driven operator with consistent quota attainment.
- No employment gap longer than 3 months.
Job Description:
- The candidate must possess a strong technology background with advanced knowledge of a Java- and Python-based technology stack.
- Java, JEE, Spring MVC, Python, JPA, Spring Boot, REST APIs, databases, Playwright, CI/CD pipelines.
- At least 3 years of hands-on Java EE and Core Java experience with strong leadership qualities.
- Experience with web service development, REST, and Service-Oriented Architecture.
- Expertise in object-oriented design, design patterns, architecture, and application integration.
- Working knowledge of databases, including design, and SQL proficiency.
- Strong experience with development and automated-testing frameworks such as Spring Boot, JUnit, BDD, etc.
- Experience with Unix/Linux operating systems and basic Linux commands.
- Strong development skills, with the ability to understand a technical design and translate it into a workable solution.
- Basic knowledge of Python and hands-on experience with Python scripting.
- Build, deploy, and monitor applications using CI/CD pipelines.
- Experience with agile development methodology.
- Good to have: Elasticsearch, MongoDB or other NoSQL databases; Docker deployments; cloud deployments; any AI/ML exposure; Snowflake experience.
Responsibilities:
- Design, develop, and implement robust and efficient backend services using microservices architecture principles.
- Write clean, maintainable, and well-documented code using C# and the .NET framework.
- Develop and implement data access layers using Entity Framework.
- Utilize Azure DevOps for version control, continuous integration, and continuous delivery (CI/CD) pipelines.
- Design and manage databases on Azure SQL.
- Perform code reviews and participate in pair programming to ensure code quality.
- Troubleshoot and debug complex backend issues.
- Optimize backend performance and scalability to ensure a smooth user experience.
- Stay up-to-date with the latest advancements in backend technologies and cloud platforms.
- Collaborate effectively with frontend developers, product managers, and other stakeholders.
- Clearly communicate technical concepts to both technical and non-technical audiences.
Qualifications:
- Strong understanding of microservices architecture principles and best practices.
- In-depth knowledge of C# programming language and the .NET framework (ASP.NET MVC/Core, Web API).
- Experience working with Entity Framework for data access.
- Proficiency with Azure DevOps for CI/CD pipelines and version control (Git).
- Experience with Azure SQL for database design and management.
- Experience with unit testing and integration testing methodologies.
- Excellent problem-solving and analytical skills.
- Ability to work independently and as part of a team.
- Strong written and verbal communication skills.
- A passion for building high-quality, scalable, and secure software applications.
Senior Big Data Engineer
Note: Notice period: 45 days
Banyan Data Services (BDS) is a US-based data-focused Company that specializes in comprehensive data solutions and services, headquartered in San Jose, California, USA.
We are looking for a Senior Hadoop Big Data Engineer who has expertise in solving complex data problems across a big data platform. You will be a part of our development team based out of Bangalore. This team focuses on the most innovative and emerging data infrastructure software and services to support highly scalable and available infrastructure.
It's a once-in-a-lifetime opportunity to join our rocket ship startup run by a world-class executive team. We are looking for candidates that aspire to be a part of the cutting-edge solutions and services we offer that address next-gen data evolution challenges.
Key Qualifications
· 5+ years of experience working with Java and Spring technologies
· At least 3 years of programming experience working with Spark on big data; including experience with data profiling and building transformations
· Knowledge of microservices architecture is a plus
· Experience with any NoSQL databases such as HBase, MongoDB, or Cassandra
· Experience with Kafka or any streaming tools
· Knowledge of Scala would be preferable
· Experience with agile application development
· Exposure to any cloud technologies, including containers and Kubernetes
· Demonstrated experience performing DevOps for platforms
· Strong skills in data structures and algorithms, with a focus on writing efficient, low-complexity code
· Exposure to Graph databases
· Passion for learning new technologies and the ability to do so quickly
· A Bachelor's degree in a computer-related field or equivalent professional experience is required
Key Responsibilities
· Scope and deliver solutions with the ability to design solutions independently based on high-level architecture
· Design and develop the big data-focused micro-Services
· Be involved in big data infrastructure, distributed systems, data modeling, and query processing
· Build software with cutting-edge technologies on the cloud
· Willingness to learn new technologies and take on research-oriented projects
· Proven interpersonal skills, contributing to the team effort by accomplishing related results as needed
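The "data profiling and building transformations" expertise asked for above amounts to computing per-column summary statistics before writing Spark jobs. A plain-Python sketch of the idea (in production this would be Spark DataFrame aggregations; the dataset and helper name are illustrative):

```python
# Toy dataset standing in for a big data table; None marks a missing value.
rows = [
    {"id": 1, "country": "IN", "amount": 10.0},
    {"id": 2, "country": None, "amount": 5.0},
    {"id": 3, "country": "IN", "amount": None},
]

def profile(rows, column):
    """Report null count and distinct non-null values for one column."""
    values = [r[column] for r in rows]
    non_null = [v for v in values if v is not None]
    return {
        "nulls": len(values) - len(non_null),
        "distinct": len(set(non_null)),
    }

assert profile(rows, "country") == {"nulls": 1, "distinct": 1}
assert profile(rows, "amount") == {"nulls": 1, "distinct": 2}
```

Profiles like this drive the transformation design: high null counts suggest imputation or filtering steps, and low distinct counts flag candidate partition or join keys.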
Responsibilities:
- Planning, estimation, requirement analysis, database design, wireframing, and layout design for the assigned project and other ongoing projects, working in a team with the Project Manager, Developers, and Designers.
- Handling coding works or major/complex parts of assigned projects and providing coding help and supervision to team members.
- Maintaining and auditing code/design quality as per set guidelines and standards in assigned projects.
- Testing of completed tasks in ongoing projects in the team before sending updates to reporting authority/client.
- Removing technical impediments.
- Keeping the team focused to ensure on-time delivery of agreed tasks.
- Reporting progress to the Project Manager.
- Facilitating code/design reuse.
- Training and mentoring of team members.
- Maintaining the skill set matrix of the team.
- Building and maintaining PWAs (Progressive Web Apps).
- Work closely with the product team to accelerate A/B testing and maximize engagement.
Requirements:
- Excellent knowledge of HTML5, CSS3.
- Excellent knowledge of JavaScript and jQuery.
- Excellent knowledge of JavaScript MVC architecture and object-oriented programming.
- Good understanding and experience of working on Vue.js/React.js.
Greetings!!
We are hiring for a TAM (Technical Account Manager); please find the details below:
To give a brief introduction to the company: CADeploy Engineering Pvt. Ltd (www.cadeploy.com) is an emerging MNC engineering firm with a growing footprint, serving mid-market and large organizations in the USA, Canada, Europe, and the UK. We provide mechanical, civil, architectural, and structural engineering solutions in the building construction, industrial, infrastructure, automotive, and aerospace sectors.
Visit our website www.cadeploy.com for further details.
Job Location : Hyderabad
Shift : Fixed Night Shift
Job Overview
The TAM manages an assigned client base, both engaged and prospective, by acting as a technical and consultative resource that coordinates with other departments within CADeploy to facilitate client needs. The TAM ensures the highest level of client satisfaction through identification of specific requirements, transcribing these into clear communication and follow-through on commitments. The general purpose is to maintain healthy relationships and project performance through continual pulse monitoring with the client. This will include checks on service delivery performance, recognition of special requirements, flagging of risk perceptions, etc. The objective is to build and maintain a strong working affiliation for continuity of account while guiding the technical and managerial execution for efficiency and effectiveness. This role is essential to the growth plan of CADeploy as a foundation to its brand recognition for quality performance.
Responsibilities and Duties
The TAM will be expected to generally execute on the following across an assigned client base:
- Serve as an important point of contact for key clients as well as all internal stakeholders.
- Develop and nurture strong client relationships through top level customer service.
- Know and document client requirements and concerns, understand the details of their business processes and ensure that CADeploy will appropriately meet their evolving needs.
- Provide excellent, regular client communications and responsive, consistent follow-through on all issues and actions.
- Coordinate and assist with the roll-out of CADeploy’s services offering to new clients as required.
- Drive strategic planning and the development of service improvements.
- Handle client requests and assist in the development of quotations/proposals ensuring that CADeploy’s standards are enforced uniformly.
- Facilitate internal project hand-offs to verify that the work performed is complete and meets both the client's and CADeploy's standards.
- Provide customized, end-user (staff) communications for all new clients engaged, ensuring a seamless and successful deployment of CADeploy’s services to their specific standards.
- Act as an escalation point for technical and client service issues as necessary.
- Work with the PMO and delivery team to direct troubleshooting efforts on escalated issues as needed.
Regards,
Bhavani
Indium Software is a niche technology solutions company with deep expertise in Digital, QA, and Gaming. Indium helps customers in their digital transformation journey through a gamut of solutions that enhance business value.
With over 1,000 associates globally, Indium operates through offices in the US, UK, and India.
Visit www.indiumsoftware.com to know more.
Job Title: Analytics Data Engineer
What will you do:
The Data Engineer must be an expert in SQL development, providing support to the Data and Analytics team in database design, data flow, and analysis activities. The position also plays a key role in the development and deployment of innovative big data platforms for advanced analytics and data processing. The Data Engineer defines and builds the data pipelines that enable faster, better, data-informed decision-making within the business.
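The pipeline-building described above follows the classic extract, transform, load pattern. A minimal plain-Python sketch of the shape of such a pipeline (the source data, validation rule, and function names are all illustrative, not a specific tool's API):

```python
# A minimal extract -> transform -> load pipeline.
def extract():
    """Pretend source: raw order records as CSV-like strings."""
    return ["1,alice,30", "2,bob,-5", "3,carol,12"]

def transform(raw_rows):
    """Parse, validate (drop negative amounts), and reshape into dicts."""
    out = []
    for line in raw_rows:
        order_id, customer, amount = line.split(",")
        if int(amount) >= 0:
            out.append({"id": int(order_id), "customer": customer, "amount": int(amount)})
    return out

def load(rows, warehouse):
    """Append cleaned rows to the target store (a list standing in for a table)."""
    warehouse.extend(rows)

warehouse = []
load(transform(extract()), warehouse)
assert [r["id"] for r in warehouse] == [1, 3]  # the invalid row was dropped
```

In a real pipeline, extract would read from source systems, transform would run as SQL or Spark jobs, and load would write to the warehouse, with an orchestrator scheduling the stages.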
We ask:
Extensive Experience with SQL and strong ability to process and analyse complex data
The candidate should also have the ability to design, build, and maintain the business's ETL pipeline and data warehouse. The candidate will also demonstrate expertise in data modelling and query performance tuning on SQL Server.
Proficiency in analytics, especially funnel analysis, with hands-on experience in analytical tools such as Mixpanel, Amplitude, ThoughtSpot, Google Analytics, and similar.
Should work on tools and frameworks required for building efficient and scalable data pipelines
Excellent at communicating and articulating ideas and an ability to influence others as well as drive towards a better solution continuously.
Experience working with Python, Hive queries, Spark, PySpark, Spark SQL, and Presto
- Relate Metrics to product
- Programmatic Thinking
- Edge cases
- Good Communication
- Product functionality understanding
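The funnel analysis mentioned in the requirements above measures how many users survive each consecutive step of a flow. A small plain-Python sketch of the computation (the event log and step names are hypothetical; tools like Mixpanel or Amplitude do this at scale):

```python
# Hypothetical event log: (user_id, step); steps form a signup funnel.
events = [
    (1, "visit"), (1, "signup"), (1, "purchase"),
    (2, "visit"), (2, "signup"),
    (3, "visit"),
]
funnel = ["visit", "signup", "purchase"]

def funnel_counts(events, funnel):
    """Count users reaching each step, requiring all earlier steps too."""
    by_user = {}
    for user, step in events:
        by_user.setdefault(user, set()).add(step)
    counts = []
    for i, step in enumerate(funnel):
        required = set(funnel[: i + 1])
        counts.append(sum(1 for steps in by_user.values() if required <= steps))
    return counts

assert funnel_counts(events, funnel) == [3, 2, 1]
```

Dividing adjacent counts gives the step-to-step conversion rate (here, 2/3 of visitors sign up and 1/2 of signups purchase), which is the metric product teams use to locate drop-off.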
Perks & Benefits:
A dynamic, creative, and intelligent team that will make you love being at work.
An autonomous and hands-on role where you can make an impact; you will be joining at an exciting time of growth!
Flexible work hours, an attractive pay package, and perks.
An inclusive work environment that lets you work in the way that works best for you!