
Job Title: Jewelry Product Designer
Job Type: Full Time
Location: Indore
Summary/Objective:
The Jewelry Product Designer holds a pivotal role in crafting innovative and visually captivating jewelry designs. Responsibilities encompass conceptualization, sketching, and development of jewelry pieces while staying attuned to industry trends and customer preferences. Collaboration with cross-functional teams is integral, ensuring designs seamlessly progress from concept to production.
Responsibilities/Duties:
1. Design Conceptualization:
· Generate imaginative and brand-aligned jewelry design concepts.
· Conduct market research to remain abreast of current trends and customer preferences.
2. Sketching:
· Create detailed sketches of jewelry designs.
3. Material Selection:
· Collaborate with procurement teams to choose suitable materials for each design.
· Ensure that chosen materials align with design concepts and meet quality standards.
4. Collaboration:
· Work closely with cross-functional teams, including production, marketing, and sales.
· Collaborate with craftsmen and artisans to bring designs to fruition.
5. Quality Assurance:
· Maintain a high standard of quality for all designs and finished products.
· Conduct regular quality checks during the production process.
Qualifications/Requirements:
Education:
· Bachelor’s degree in Jewelry Design, Fine Arts, or a related field.
Experience:
· Minimum of 3 years of experience in Jewelry Product Design.
Skills:
· Strong understanding of materials, gemstones, and manufacturing processes.
· Excellent sketching and rendering abilities.
· Attention to detail and a keen sense of aesthetics.

About Avalon Solution
About Gradera — Digital Twin & Physical AI Platform
At Gradera, we are building a next-generation Digital Twin and Physical AI platform that enables enterprises to model, simulate, and optimize complex real-world systems. Our work brings together strategy, architecture, data, simulation, and experience design to power decision-making across large-scale operational environments such as manufacturing, logistics, and supply chain networks.
This platform-led initiative applies AI-native execution, advanced simulation, and governed orchestration to help organizations test scenarios, predict outcomes, and continuously improve performance. We operate with an enterprise-first mindset prioritizing reliability, transparency, and measurable business impact as we build intelligent systems that scale beyond a single industry or use case.
Data Quality Engineer
Overview
We are seeking a detail-oriented Data Quality Engineer to ensure the integrity, accuracy, and reliability of data powering our digital twin and AI platforms. You will design and implement data quality frameworks, build automated validation pipelines, and establish quality metrics that enable trusted, simulation-ready data products. This role is critical to ensuring that operational decisions and ML models are built on a foundation of high-quality, governed data.
Our core data quality stack includes:
Data Quality Frameworks
- Delta Live Tables expectations for declarative quality enforcement
- Great Expectations for comprehensive data validation
- Databricks data profiling and quality monitoring
Platform & Tools
- Databricks SQL and PySpark for quality checks at scale
- Unity Catalog for lineage tracking and governance compliance
- Python for custom validation logic and anomaly detection
Observability
- Quality metrics dashboards and alerting
- Data profiling and statistical analysis
- Anomaly detection and drift monitoring
Key Responsibilities
- Design and implement data quality frameworks using Delta Live Tables expectations and Great Expectations
- Build automated data validation pipelines that enforce quality standards at ingestion and transformation stages
- Develop data profiling processes to understand data distributions, patterns, and anomalies
- Define and track data quality metrics (completeness, accuracy, consistency, timeliness, validity)
- Implement anomaly detection mechanisms to identify data drift and quality degradation
- Create quality dashboards and alerting systems for proactive issue identification
- Collaborate with data engineers to embed quality checks into ETL/ELT pipelines
- Partner with data architects to establish data quality standards and governance policies
- Investigate and perform root cause analysis for data quality issues
- Document data quality rules, thresholds, and remediation procedures
- Support data certification processes for simulation-ready and ML-ready datasets
- Drive continuous improvement in data quality practices and tooling
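The quality metrics named above (completeness, validity, and so on) can be sketched in a few lines of plain Python. This is an illustrative sketch only; the sensor records, field names, and validity range are invented for the example and are not part of Gradera's actual stack.

```python
# Minimal sketch of two data quality metrics: completeness and validity.
# The records, fields, and thresholds below are hypothetical examples.

def completeness(records, field):
    """Fraction of records where `field` is present and non-null."""
    if not records:
        return 0.0
    filled = sum(1 for r in records if r.get(field) is not None)
    return filled / len(records)

def validity(records, field, predicate):
    """Fraction of non-null values of `field` that satisfy `predicate`."""
    values = [r[field] for r in records if r.get(field) is not None]
    if not values:
        return 0.0
    return sum(1 for v in values if predicate(v)) / len(values)

sensors = [
    {"sensor_id": "s1", "temp_c": 21.5},
    {"sensor_id": "s2", "temp_c": None},   # missing reading
    {"sensor_id": "s3", "temp_c": 999.0},  # out-of-range reading
]

report = {
    "completeness": completeness(sensors, "temp_c"),
    "validity": validity(sensors, "temp_c", lambda t: -40 <= t <= 125),
}
```

In a production pipeline the same checks would typically be expressed declaratively (for example as Delta Live Tables expectations) rather than hand-rolled, with thresholds driving alerts on the quality dashboards.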
Preferred Qualifications
- 6+ years of experience in data engineering or data quality roles, with 3+ years focused on data quality
- Track record of implementing enterprise-scale data quality frameworks
- Experience with Lakehouse architectures (Delta Lake, Iceberg)
- Familiarity with real-time data quality monitoring for streaming pipelines
- Experience working in agile, cross-functional teams
Highly Desirable
- Experience with data quality for digital twin or simulation platforms
- Familiarity with operational state data validation and temporal consistency checks
- Experience with graph data quality validation (Neo4j or similar)
- Exposure to ML data quality (feature validation, training data quality)
- Experience with data observability platforms
- Exposure to industrial domains such as Manufacturing, Logistics, or Transportation is a plus
Location: Hyderabad, Telangana
Department: Engineering
Employment Type: Full-Time
Senior BI Engineer
📍 Location: BHive Bangalore, India
💼 Department: Technology – Data
🚀 Experience Level: Senior
About the Role
Mumzworld is scaling fast, and data is central to our success.
We are looking for a Senior BI Engineer to join our data team and help drive business insights, storytelling, and decision-making through high-quality analytics and reporting.
This is a high-impact role, reporting directly to the Head of Data, and working cross-functionally with business teams, product managers, and engineers to transform data into actionable intelligence.
You will not only deliver robust dashboards and deep-dive analyses but also help define metrics, improve data literacy across the company, and ensure that our analytics layer is consistent, trusted, and scalable.
You will also leverage Generative AI tools to boost productivity, streamline reporting workflows, and enhance collaboration.
Key Responsibilities
- Deliver Impactful Insights
Build, maintain, and optimize dashboards, reports, and deep-dive analyses to support critical business decisions.
- Create Impact Dashboards with Storytelling
Design dashboards that not only show data but tell a clear story — highlighting insights, business implications, and actions.
- Define and Standardize Metrics
Work closely with stakeholders to define KPIs and ensure consistency in metrics across business domains.
- Model for Analytics Needs
Collaborate with data engineers and modelers to ensure data structures (in bronze, silver, gold layers) are optimized for reporting and analytical needs following medallion architecture principles.
- Drive Business Conversations
Translate business questions into clear analytical deliverables; turn data findings into actionable insights and business narratives.
- Ensure Data Quality & Trust
Validate data accuracy and drive initiatives to improve data reliability across all reports and dashboards.
- Leverage Gen AI for Productivity
Promote the use of Generative AI tools to accelerate report building, automate documentation, and summarize insights faster.
- Advance Data-Driven Culture
Conduct workshops, share knowledge, and uplift the organization’s data literacy and self-service capabilities.
- Support Data Science Projects
Partner with the data science team to ensure BI foundations support predictive models, experimentation, and advanced analytics initiatives.
What We’re Looking For
- 5+ years of experience in business intelligence, data analysis, or analytics roles.
- Expertise in BI tools like Looker, Power BI, or Tableau.
- Strong SQL skills to extract, transform, and analyze large datasets.
- Solid understanding of data modeling principles (especially star schema and medallion architecture).
- Understanding of incremental load vs full load strategies, and a passion for building your own scheduled queries and data refreshes.
- Experience working with cloud data warehouses like BigQuery or Snowflake.
- Strong business acumen with the ability to translate data into business value.
- Experience building self-service datasets, semantic layers, or LookML models is a plus.
- Excellent communication skills — able to present complex findings in a clear and actionable way.
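The incremental-vs-full-load distinction listed above can be illustrated with a simple high-watermark pattern. This is a hedged sketch in plain Python standing in for a scheduled warehouse query; the `orders` table and `updated_at` column are hypothetical names, not Mumzworld's schema.

```python
# Sketch of an incremental load using a high-watermark timestamp.
# In a real warehouse this would be a scheduled SQL query; here plain
# Python stands in for the logic. All names are illustrative.

def incremental_load(source_rows, watermark):
    """Return only rows updated after the last recorded watermark,
    plus the new watermark to persist for the next run."""
    new_rows = [r for r in source_rows if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in new_rows), default=watermark)
    return new_rows, new_watermark

orders = [
    {"order_id": 1, "updated_at": "2024-01-01"},
    {"order_id": 2, "updated_at": "2024-01-05"},
    {"order_id": 3, "updated_at": "2024-01-09"},
]

# A run with watermark 2024-01-03 picks up only orders 2 and 3,
# instead of re-scanning the full table as a full load would.
rows, wm = incremental_load(orders, "2024-01-03")
```

A full load, by contrast, would reprocess every row on each run; the high-watermark approach keeps scheduled refreshes cheap as the source table grows.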
Nice-to-Have
📅 E-commerce experience – Familiarity with product, customer, and order data structures.
🤖 Experience with dbt Cloud and modern analytics stacks.
📈 Exposure to data governance, catalogs, or metadata management tools.
🖥️ Experience with GitHub for version control in analytics workflows.
🌐 Understanding of privacy, GDPR, and data compliance best practices.
Who You Are – Soft Skills & Personality
We’re looking for a curious, business-savvy, and detail-driven BI analyst who thrives in a fast-paced, collaborative environment.
👤 Structured & Organized – You design with clarity, document your work well, and maintain consistency across reports and analyses.
💬 Collaborative Communicator – You work well with both technical and business stakeholders.
🚀 Results-Oriented – You focus on delivering impactful insights that drive business outcomes.
🧬 Analytical & Curious – You dig deep into data, question assumptions, and continuously look for improvement opportunities.
🤝 Team Player – You support, teach, and uplift others across the organization.
If you’re passionate about turning data into impact, scaling analytics, and building a data-driven culture, we’d love to hear from you.
- Involved in capex modelling, budgeting, and investment decision-making
- Should be able to converse well with global stakeholders
- O&G, metals and mining, or heavy manufacturing background preferred
- Other: not a frequent job hopper
- Margin optimization: holistic understanding of the O&G P&L (not limited to 1-2 line items only)
Location: All Metros
Requirements: MB Design experience; experience in SOA/Middleware integration
Our client is a B2B2C Web3 tech startup founded by IIT Bombay graduates experienced in retail, e-commerce, and fintech.
Vision: The client aims to change the way customers, creators, and retail investors interact and transact with brands of all shapes and sizes; essentially, becoming the Web3 version of a brand-driven social e-commerce and investment platform. We have two broader development phases to achieve our mission.
The candidate will be responsible for automating the deployment of cloud infrastructure and services to support application development and hosting (architecting, engineering, deploying, and operationally managing the underlying logical and physical cloud computing infrastructure).
Location: Bangalore
Reporting Manager: VP, Engineering
Job Description:
● Collaborate with teams to build and deliver solutions implementing serverless, microservice-based, IaaS, PaaS, and containerized architectures in GCP/AWS environments.
● Deploy highly complex, distributed transaction processing systems.
● Drive continuous improvement of the products through innovation and learning; a knack for benchmarking and optimization is essential.
● Hire, develop, and cultivate a reliable, high-performing cloud support team.
● Build and operate complex CI/CD pipelines at scale.
● Work with GCP services: Private Service Connect, Cloud Run, Cloud Functions, Pub/Sub, Cloud Storage, and networking in general.
● Collaborate with Product Management and Product Engineering teams to drive excellence in Google Cloud products and features.
● Ensure efficient data storage and processing in accordance with company security policies and cloud security best practices.
● Ensure scaled database setup and monitoring with near-zero downtime.
Key Skills:
● Hands-on software development experience in Python, NodeJS, or Java
● 5+ years of Linux/Unix administration: monitoring, reliability, and security of Linux-based, online, high-traffic services and web/e-commerce properties
● 5+ years of production experience in large-scale cloud-based Infrastructure (GCP preferred)
● Strong experience with Log Analysis and Monitoring tools such as CloudWatch, Splunk,
Dynatrace, Nagios, etc.
● Hands-on experience with AWS Cloud – EC2, S3 Buckets, RDS
● Hands-on experience with Infrastructure as Code (e.g., CloudFormation, ARM, Terraform, Ansible, Chef, Puppet) and version control tools
● Hands-on experience with configuration management (Chef/Ansible)
● Experience in designing High Availability infrastructure and planning for Disaster Recovery
solutions
● Knowledgeable in networking and security; familiar with GCP services (databases, containers, compute, storage, etc.) and comfortable with GCP serverless technologies
● Exposure to Linkerd, Helm charts, and Ambassador is mandatory
● Experience with Big Data on GCP (BigQuery, Pub/Sub, Dataproc, and Dataflow) is a plus
● Experience with Groovy and writing Jenkinsfiles
● Experience with time-series and document databases (e.g. Elasticsearch, InfluxDB, Prometheus)
● Experience with message buses (e.g. Apache Kafka, NATS)
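Much of the reliability work described above comes down to well-known patterns such as retry with exponential backoff for transient failures. The sketch below is a generic illustration, not a tool from this posting's stack; the failing operation is simulated, and delays are computed rather than slept so the logic stays easy to test.

```python
# Generic retry-with-exponential-backoff sketch for flaky cloud calls.
# The flaky operation below is simulated for illustration.

def backoff_delays(base=1.0, factor=2.0, max_retries=5, cap=30.0):
    """Delay schedule: base * factor**attempt, capped at `cap` seconds."""
    return [min(base * factor ** i, cap) for i in range(max_retries)]

def call_with_retry(op, max_retries=5):
    """Invoke `op`; retry on exception up to `max_retries` attempts."""
    last_err = None
    for attempt in range(max_retries):
        try:
            return op(attempt)
        except RuntimeError as err:
            last_err = err
    raise last_err

attempts = []

def flaky(attempt):
    """Simulated operation that fails twice, then succeeds."""
    attempts.append(attempt)
    if attempt < 2:
        raise RuntimeError("transient error")
    return "ok"

result = call_with_retry(flaky)
```

In production the delays would usually be jittered to avoid thundering-herd retries against a recovering service.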
Regards
Team Merito
WIOM will be providing unlimited high-speed Internet to 50 million households over the next 7 years. Using its unique shared-economy approach and an Uber-like model, WIOM is India’s 1st PDOA under the new PM-WANI scheme. It enables unlimited Internet for as low as INR 5 per day. Backed by IIT Delhi and marquee investors, WIOM has a stellar founding team with IIT/IIM pedigree.
Why should you consider WIOM?
If you want to retire in the next 5 years having created value and impact worth 25 years of hard work, then WIOM is the place. In the next 5 years, we will be creating 1000X growth and a USD 8B enterprise. By joining the team at this stage (we are fewer than 30 members today), you are taking a ticket to an exclusive millionaire club that will have impacted the lives of 500 million people. You get the opportunity to work with the sharpest minds in the country in a fast-paced environment that will bring out the best in you.
What will you be doing?
We are looking for an Android developer who will be responsible for the development and maintenance of applications aimed at a vast number of diverse Android devices. Your primary focus will be the development of Android applications and their integration with back-end services. You will be working alongside other engineers and developers responsible for different layers of the infrastructure, so a commitment to collaborative problem solving, sophisticated design, and creating quality products is essential.
• Translate designs and wireframes into high-quality code
• Design, build, and maintain high-performance, reusable, and reliable Java code
• Ensure the best possible performance, quality, and responsiveness of the application
• Identify and correct bottlenecks and fix bugs
• Continuously work on improving application performance
What can make you a great fit for this role?
• Excellent analytical and coding skills
• Strong OO design and hands-on programming skills in Android and Java
• Experience building reactive mobile web applications using React.js, Vue.js, or Angular.js
• Android native app and SDK development experience
• Work experience with social media and other third-party APIs (e.g., payment gateways) and third-party libraries
• Experience with mobile databases; strong grasp of data structures and algorithms
• Knowledge of Google and Firebase APIs
1. 3+ years of experience in web development
2. Minimum 2+ years of experience in Node.js
3. Databases: MongoDB
Good to have
- Kubernetes & Docker, and experience with cloud service APIs (e.g., GCP), are desirable.
- GitHub CI/CD experience
- Experience in the real estate domain will be a plus
- Experience using Python/Django frameworks to build the back end of full-stack applications
- Hands-on experience with web frameworks, RESTful APIs, and SOAP
- Strong command of SQL (or a relevant query language or ORM queries)
- Knowledge of front-end technologies such as React Native, React.js, Angular.js, or JavaScript
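The SQL fluency listed above can be illustrated with a minimal sketch using Python's built-in sqlite3 module. The `listings` schema here is invented for the example; it is not the client's actual data model.

```python
import sqlite3

# Minimal sketch of relational SQL via Python's stdlib sqlite3 driver.
# The `listings` table is a hypothetical real-estate-flavored example.

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE listings (id INTEGER PRIMARY KEY, city TEXT, price REAL)"
)
conn.executemany(
    "INSERT INTO listings (city, price) VALUES (?, ?)",
    [("Pune", 85.0), ("Pune", 95.0), ("Indore", 60.0)],
)

# Average price per city: the kind of aggregate a RESTful
# endpoint behind a Django view might serve.
rows = conn.execute(
    "SELECT city, AVG(price) FROM listings GROUP BY city ORDER BY city"
).fetchall()
```

An ORM such as Django's would express the same aggregate as a queryset with `values(...).annotate(...)`, but the underlying SQL is what the role's "strong command of SQL" requirement points at.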