
Key Responsibilities
- Flow Development & Automation
- Develop, maintain, and enhance CAD automation scripts and flows for physical design (place-and-route, timing closure, physical verification, etc.).
- Integrate and validate EDA tools for synthesis, floorplanning, clock tree synthesis, routing, and sign-off.
- EDA Tool Support
- Work closely with design teams to debug and resolve CAD/EDA tool issues.
- Collaborate with EDA vendors for tool evaluations, feature requests, and bug fixes.
- Physical Verification & Sign-Off
- Build and maintain flows for DRC, LVS, ERC, IR drop, EM, and timing sign-off.
- Ensure physical design flows meet foundry requirements and tapeout schedules.
- Methodology Development
- Develop best practices and guidelines for efficient design closure.
- Evaluate new EDA technologies and propose improvements to existing workflows.
Experience: 4+ years.
Location: Vadodara & Pune
Skill Set: Snowflake, Power BI, ETL, SQL, Data Pipelines
What you'll be doing:
- Develop, implement, and manage scalable Snowflake data warehouse solutions using advanced features such as materialized views, task automation, and clustering.
- Design and build real-time data pipelines from Kafka and other sources into Snowflake using Kafka Connect, Snowpipe, or custom solutions for streaming data ingestion.
- Create and optimize ETL/ELT workflows using tools like DBT, Airflow, or cloud-native solutions to ensure efficient data processing and transformation.
- Tune query performance, warehouse sizing, and pipeline efficiency by utilizing Snowflake's Query Profiling, Resource Monitors, and other diagnostic tools.
- Work closely with architects, data analysts, and data scientists to translate complex business requirements into scalable technical solutions.
- Enforce data governance and security standards, including data masking, encryption, and RBAC, to meet organizational compliance requirements.
- Continuously monitor data pipelines, address performance bottlenecks, and troubleshoot issues using monitoring frameworks such as Prometheus, Grafana, or Snowflake-native tools.
- Provide technical leadership, guidance, and code reviews for junior engineers, ensuring best practices in Snowflake and Kafka development are followed.
- Research emerging tools, frameworks, and methodologies in data engineering and integrate relevant technologies into the data stack.
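Several of the duties above (streaming ingestion, cleaning, and batched loading) share the same basic shape regardless of tool. Below is a minimal plain-Python sketch of that shape; the event fields and the `clean_event`/`micro_batches` helpers are invented for illustration and are not part of any Snowflake or Kafka API:

```python
import json
from datetime import datetime, timezone

def clean_event(raw: str):
    """Parse one raw JSON event; return a flat dict, or None if malformed."""
    try:
        event = json.loads(raw)
        return {
            "id": int(event["id"]),
            "ts": datetime.fromisoformat(event["ts"]).astimezone(timezone.utc),
            "amount": float(event.get("amount", 0.0)),
        }
    except (ValueError, KeyError, TypeError):
        return None  # reject the bad record instead of failing the whole batch

def micro_batches(raw_events, batch_size=2):
    """Group cleaned events into fixed-size batches, as a streaming loader would."""
    cleaned = [e for e in (clean_event(r) for r in raw_events) if e is not None]
    return [cleaned[i:i + batch_size] for i in range(0, len(cleaned), batch_size)]

raw = [
    '{"id": 1, "ts": "2024-01-01T00:00:00+00:00", "amount": "9.5"}',
    'not json',                                      # dropped
    '{"id": 2, "ts": "2024-01-01T00:01:00+00:00"}',  # amount defaults to 0.0
    '{"id": 3, "ts": "2024-01-01T00:02:00+00:00", "amount": 4}',
]
batches = micro_batches(raw)
print(len(batches), [e["id"] for b in batches for e in b])
```

In a real deployment the batching step would be replaced by a Snowpipe or Kafka Connect sink; the clean-then-batch discipline stays the same.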
What you need:
Basic Skills:
- 3+ years of hands-on experience with Snowflake data platform, including data modeling, performance tuning, and optimization.
- Strong experience with Apache Kafka for stream processing and real-time data integration.
- Proficiency in SQL and ETL/ELT processes.
- Solid understanding of cloud platforms such as AWS, Azure, or Google Cloud.
- Experience with scripting languages like Python, Shell, or similar for automation and data integration tasks.
- Familiarity with tools like dbt, Airflow, or similar orchestration platforms.
- Knowledge of data governance, security, and compliance best practices.
- Strong analytical and problem-solving skills with the ability to troubleshoot complex data issues.
- Ability to work in a collaborative team environment and communicate effectively with cross-functional teams.
Responsibilities:
- Design, develop, and maintain Snowflake data warehouse solutions, leveraging advanced Snowflake features like clustering, partitioning, materialized views, and time travel to optimize performance, scalability, and data reliability.
- Architect and optimize ETL/ELT pipelines using tools such as Apache Airflow, DBT, or custom scripts to ingest, transform, and load data into Snowflake from sources like Apache Kafka and other streaming/batch platforms.
- Work in collaboration with data architects, analysts, and data scientists to gather and translate complex business requirements into robust, scalable technical designs and implementations.
- Design and implement Apache Kafka-based real-time messaging systems to efficiently stream structured and semi-structured data into Snowflake, using Kafka Connect, KSQL, and Snowpipe for real-time ingestion.
- Monitor and resolve performance bottlenecks in queries, pipelines, and warehouse configurations using tools like Query Profile, Resource Monitors, and Task Performance Views.
- Implement automated data validation frameworks to ensure high-quality, reliable data throughout the ingestion and transformation lifecycle.
- Pipeline Monitoring and Optimization: Deploy and maintain pipeline monitoring solutions using Prometheus, Grafana, or cloud-native tools, ensuring efficient data flow, scalability, and cost-effective operations.
- Implement and enforce data governance policies, including role-based access control (RBAC), data masking, and auditing to meet compliance standards and safeguard sensitive information.
- Provide hands-on technical mentorship to junior data engineers, ensuring adherence to coding standards, design principles, and best practices in Snowflake, Kafka, and cloud data engineering.
- Stay current with advancements in Snowflake, Kafka, cloud services (AWS, Azure, GCP), and data engineering trends, and proactively apply new tools and methodologies to enhance the data platform.
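The "automated data validation" responsibility above can be made concrete with a small sketch. This is a tool-agnostic illustration in plain Python; the `validate`/`partition` helpers and the schema are invented for the example, not a real framework's API:

```python
def validate(record, schema):
    """Return a list of violations for one record against a {field: type} schema."""
    errors = []
    for field, expected in schema.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected):
            errors.append(f"{field}: expected {expected.__name__}")
    return errors

def partition(records, schema):
    """Split records into (valid, rejected) so bad rows are quarantined, not loaded."""
    valid, rejected = [], []
    for rec in records:
        errs = validate(rec, schema)
        (valid if not errs else rejected).append((rec, errs))
    return [r for r, _ in valid], rejected

schema = {"user_id": int, "email": str}
records = [
    {"user_id": 1, "email": "a@example.com"},
    {"user_id": "2", "email": "b@example.com"},  # wrong type -> quarantined
    {"email": "c@example.com"},                  # missing user_id -> quarantined
]
good, bad = partition(records, schema)
```

The same pattern scales up in dbt tests or a custom Airflow task: validate on the way in, quarantine failures with their reasons, and only load rows that pass.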
Job Description for Digital Marketing Executive
About Quantum:
We are a product innovation company that helps the Transportation and Shipping industries overcome their business challenges and lead in Digital Transformation. Our strength is our in-depth knowledge of the Transportation and Shipping landscape. Our journey began in 2003, when we started simplifying the complexities of these businesses. At our core, we aspire to drive great change in the Transportation business through our inventions and technology. We are based in Bangalore.
Job role:
· Job role: Digital Marketing Executive
· Educational qualification: BBA/BBM/MBA
· Age limit: 28 to 32 years
· Job location: Koramangala, Bangalore. This is a work-from-office role
· Years of experience: 3 to 6 years
What you will be doing:
· Responsible for formulating and executing the company’s Digital Marketing Strategy
· Creation and management of the company website
· Independently managing all digital marketing activities
· Setting up and managing LinkedIn and Twitter accounts
· Writing blogs and content on digital forums to ensure visibility
· Responsible for developing and implementing the Online Marketing and Brand Positioning strategies
· Create key reporting and performance indicators based on business requirements
· Brand management on digital forums
Core skill required:
· Social Media Management
· Search Engine Optimization
· Social Media Calendar Management
· Brand Reputation Management
· Google Funnel Creation and Optimization
· Email Marketing
· Programmatic Advertising
· Google AdWords
· Meta Advertising
· LinkedIn Advertising
· Brand Management
· Off page SEO & On-Page SEO Analysis, Technical SEO
· Google Analytics
· Advanced Web Page Ranking
· Google Webmasters
· Google Search Console
· Google Trends
· Blog Management/Content Generation
Primary skills:
· 3+ years of experience as a Digital Marketing Professional
· Understanding the importance of the Website
· Experience in email marketing and extending it across digital channels
· Social media marketing - visibility
· Experience in branding
· Must understand on-page and off-page optimization
· Understands the value of search engine optimization as a complementary skill
· Blog writing and Content generation
Secondary skills:
· Knowledge of Google certifications, Google Analytics, and Google AdWords
· Setting up tracking systems for marketing campaigns and online activities
· Experience in Website maintenance and press releases
· Latest marketing techniques
Get in touch or apply
At Quantum, our clients are important to us, and our internal team is just as valuable. If you invest in us, we will invest back in you through learning, recognition, and opportunities to grow. Be a part of the team that innovates for the future.
Our Social Media presence:
Website: www.quantumbso.com
LinkedIn: https://www.linkedin.com/company/quantumbso/
Company location:
Quantum BSO and Tech Pvt. Ltd, 3rd Floor, Ahad Pinnacle, #80, 5th Main, 2nd Cross, 5th Block, Koramangala Industrial Area, Bengaluru, Karnataka 560095.
Position - Recruiter & HR
About the role
Identify qualified Tech & Non-tech candidate profiles using various sourcing techniques
Proactively interact with potential candidates through various job portals, social media, and professional networks (e.g. LinkedIn, Indeed, AngelList)
Work directly with the founders and develop talent pipelines for future hiring needs and strategic roles
Ensuring a good candidate experience during the recruitment process
As our first HR hire, be responsible for some of the other early stage HR functions (Onboarding & Policies).
What are we looking for?
3–5+ years working in early-stage startups or similar organizations
You are a self-starter who can make decisions on your feet with minimal supervision and take end-to-end ownership to drive outcomes.
You possess deep employee empathy
We are location agnostic, but you should be in the same time zone (India). We have a small garden office @ Jayanagar, Bangalore (home to some of the best dosas in India) to jam together on anything that requires space.
You are somebody who shares the same ethos.
Bonus
You have an interest in EdTech & learning.
You have interests outside work, such as art, music, cooking, dance, or any other similar skill.
What is our interview process?
An initial 30 min call to understand each other at a high level.
Work on a case study or assignment together to ensure a good match.
Job Sector: IT, Software
Job Type: Permanent
Location: Chennai
Experience: 10 - 20 Years
Salary: 12 – 40 LPA
Education: Any Graduate
Notice Period: Immediate
Key Skills: Python, Spark, AWS, SQL, PySpark
Contact at triple eight two zero nine four two double seven
Job Description:
Requirements
- Minimum 12 years' experience
- In-depth understanding of and knowledge in distributed computing with Spark
- Deep understanding of Spark architecture and internals
- Proven experience in data ingestion, data integration, and data analytics with Spark, preferably PySpark
- Expertise in ETL processes, data warehousing, and data lakes
- Hands-on with Python for big data and analytics
- Hands-on experience with the Agile Scrum model is an added advantage
- Knowledge of CI/CD and orchestration tools is desirable
- Knowledge of AWS S3, Redshift, and Lambda is preferred
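Much of the ingestion and analytics work described above reduces to a filter → group → aggregate pipeline. As a rough illustration, here is that shape in plain Python (so it runs without a Spark cluster); the rows and column names are made up, and the equivalent PySpark call is shown in a comment:

```python
from collections import defaultdict

# Toy rows standing in for a DataFrame read from S3 (schema is invented).
rows = [
    {"region": "south", "sales": 120.0},
    {"region": "north", "sales": 80.0},
    {"region": "south", "sales": 50.0},
    {"region": "north", "sales": None},  # dirty row to filter out
]

# Same shape as the PySpark pipeline:
#   df.filter(col("sales").isNotNull()).groupBy("region").agg(sum("sales"))
clean = [r for r in rows if r["sales"] is not None]
totals = defaultdict(float)
for r in clean:
    totals[r["region"]] += r["sales"]

print(dict(totals))
```

In PySpark the same logic is expressed declaratively and executed in parallel across partitions, but the reasoning a candidate needs (which rows to drop, which keys to group on, which aggregate to compute) is identical.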
1. Research and identify institutional, healthcare, and/or hospitality projects that have a requirement of architectural design/redesign
2. Create a database of such project leads and clients
3. Coordinate and schedule meetings with prospective clients
4. Follow up with the leads
Additional information -
1. Prior experience in conducting market research and sourcing projects will be preferable
2. Prior knowledge of the Software Development field is a plus

- Expert-level knowledge of designing, developing, and testing web apps with Angular versions 6 and above.
- Primary Skills – Angular 6+, HTML, CSS3, JavaScript.
- Node.js as a backend will be an advantage.
- Knowledge of MongoDB and MySQL will be an advantage.
- Highly skilled at front-end engineering using object-oriented JavaScript, TypeScript, and various JavaScript/TypeScript patterns and frameworks.
- Should have experience in publishing and consuming services using REST APIs.
- Excellent time-management, multitasking, and communication skills. Capable of handling multiple projects and related complexities at the same time.

Responsibilities
- Installing and configuring Informatica components, including high availability; managing server activations and de-activations for all environments; ensuring that all systems and procedures adhere to organizational best practices
- Day to day administration of the Informatica Suite of services (PowerCenter, IDS, Metadata, Glossary and Analyst).
- Informatica capacity planning and on-going monitoring (e.g. CPU, Memory, etc.) to proactively increase capacity as needed.
- Manage backup and security of Data Integration Infrastructure.
- Design, develop, and maintain all data warehouse, data marts, and ETL functions for the organization as a part of an infrastructure team.
- Consult with users, management, vendors, and technicians to assess computing needs and system requirements.
- Develop and interpret organizational goals, policies, and procedures.
- Evaluate the organization's technology use and needs and recommend improvements, such as software upgrades.
- Prepare and review operational reports or project progress reports.
- Assist in the daily operations of the Architecture Team, analyzing workflow, establishing priorities, developing standards, and setting deadlines.
- Work with vendors to manage support SLAs and influence vendor product roadmaps.
- Provide leadership and guidance in technical meetings, define standards, and provide status updates.
- Work with cross functional operations teams such as systems, storage and network to design technology stacks.
Preferred Qualifications
- Minimum of 6+ years' experience in Informatica engineer and developer roles
- Minimum of 5+ years' experience in an ETL environment as a developer
- Minimum of 5+ years' experience in SQL coding and understanding of databases
- Proficiency in Python
- Proficiency in command line troubleshooting
- Proficiency in writing code in Perl/Shell scripting languages
- Understanding of Java and concepts of Object-oriented programming
- Good understanding of systems, networking, and storage
- Strong knowledge of scalability and high availability



