
- 6+ years of relevant experience in DB2 LUW Administration
- Good experience with:
  - Performance tuning and troubleshooting
  - High availability solutions: HACMP, TSA, MSCS Cluster
  - Monitoring, backup, and recovery: IBM Data Server Manager, TSM, Commvault
  - Data replication: Q-replication and CDC
  - Implementing DB2 key features
- Desirable experience with Db2 pureScale
- Experience with tools such as ITM, Nagios, and ServiceNow
- Experience with automation and scripting, e.g., cron, PowerShell, and shell scripting
- Experience configuring and using clustering, db2diag and notification logs, and snapshot and event monitors
- Experience with use of problem and change management tools
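As an illustration of the diagnostic-log work described above, here is a minimal sketch of tallying db2diag.log records by severity. It assumes only that each record carries a `LEVEL: <severity>` field, as in the default log format; the sample excerpt is invented, not taken from a real instance.

```python
import re
from collections import Counter

# Tally db2diag.log records by severity, assuming each record
# carries a "LEVEL: <severity>" field (default db2diag format).
def count_levels(diag_text: str) -> Counter:
    return Counter(m.group(1) for m in re.finditer(r"LEVEL:\s*(\w+)", diag_text))

# Hypothetical excerpt standing in for a real db2diag.log
sample = """\
2024-01-05-10.12.33 LEVEL: Warning
FUNCTION: DB2 UDB, buffer pool services
2024-01-05-10.14.01 LEVEL: Severe
FUNCTION: DB2 UDB, data protection services
2024-01-05-10.15.22 LEVEL: Severe
FUNCTION: DB2 UDB, recovery manager
"""

levels = count_levels(sample)
print(levels["Severe"])  # 2
```

A script like this could run from cron and feed a notification tool; in practice the severity threshold and alert routing would follow the site's monitoring standards.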

Responsibilities:
- Develop and maintain high-quality, efficient, and scalable backend applications.
- Participate in all phases of the software development lifecycle (SDLC)
- Write clean, well-documented, and testable code adhering to best practices.
- Collaborate with team members to ensure the successful delivery of projects.
- Debug and troubleshoot complex technical problems.
- Identify and implement performance optimizations.
- Participate in code reviews
Requirements:
- 2-5 years of experience developing Java applications.
- Hands-on experience with Spring Boot and Java 8 or above.
- Knowledge of at least one messaging system, such as Kafka or RabbitMQ.
- React developer requirements, qualifications & skills:
- Proficiency in React.js and its core principles
- Strong JavaScript, HTML5, and CSS3 skills
- Experience with popular React.js workflows (such as Redux)
- Strong understanding of object-oriented programming (OOP) principles.
- Experience with design patterns and best practices for Java development.
- Proficient in unit testing frameworks (e.g., JUnit).
- Experience with build automation tools (e.g., Maven, Gradle).
- Experience with version control systems (e.g., Git).
- Experience with one of these databases: Postgres, MongoDB, Cassandra
- Knowledge of Retail or OMS is a plus.
- Experience with containerized deployments using Docker and Kubernetes, and a DevOps mindset
- Ability to reverse-engineer existing/legacy code and document findings on Confluence.
- Create automated tests for unit, integration, regression, performance, and functional testing, to meet established expectations and acceptance criteria.
- Document APIs using Lowe’s established tooling.

Experience: 4+ years.
Location: Vadodara & Pune
Skill Set: Snowflake, Power BI, ETL, SQL, Data Pipelines
What you'll be doing:
- Develop, implement, and manage scalable Snowflake data warehouse solutions using advanced features such as materialized views, task automation, and clustering.
- Design and build real-time data pipelines from Kafka and other sources into Snowflake using Kafka Connect, Snowpipe, or custom solutions for streaming data ingestion.
- Create and optimize ETL/ELT workflows using tools like DBT, Airflow, or cloud-native solutions to ensure efficient data processing and transformation.
- Tune query performance, warehouse sizing, and pipeline efficiency by utilizing Snowflake's Query Profile, Resource Monitors, and other diagnostic tools.
- Work closely with architects, data analysts, and data scientists to translate complex business requirements into scalable technical solutions.
- Enforce data governance and security standards, including data masking, encryption, and RBAC, to meet organizational compliance requirements.
- Continuously monitor data pipelines, address performance bottlenecks, and troubleshoot issues using monitoring frameworks such as Prometheus, Grafana, or Snowflake-native tools.
- Provide technical leadership, guidance, and code reviews for junior engineers, ensuring best practices in Snowflake and Kafka development are followed.
- Research emerging tools, frameworks, and methodologies in data engineering and integrate relevant technologies into the data stack.
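The streaming-ingestion duties above can be pictured with a small sketch: validating a micro-batch of Kafka-style events before they are staged for Snowpipe. Everything here is an assumption for illustration; the event schema (`event_id`, `ts`, `amount`) is invented, and a real pipeline would use Kafka Connect or Snowpipe client libraries rather than in-process lists.

```python
import json

# Illustrative only: split a micro-batch of raw JSON event strings
# into loadable rows and rejects before staging for Snowpipe.
# The required fields are a made-up example schema, not a real contract.
REQUIRED_FIELDS = {"event_id", "ts", "amount"}

def validate_batch(raw_events):
    """Return (good, bad): parsed loadable events and rejected raw strings."""
    good, bad = [], []
    for raw in raw_events:
        try:
            event = json.loads(raw)
        except json.JSONDecodeError:
            bad.append(raw)          # malformed JSON -> reject
            continue
        if REQUIRED_FIELDS <= event.keys():
            good.append(event)       # has all required fields -> load
        else:
            bad.append(raw)          # incomplete event -> reject
    return good, bad

batch = [
    '{"event_id": 1, "ts": "2024-01-05T10:00:00Z", "amount": 9.5}',
    '{"event_id": 2}',    # missing fields -> rejected
    'not json at all',    # malformed -> rejected
]
good, bad = validate_batch(batch)
print(len(good), len(bad))  # 1 2
```

Rejected rows would typically land in a dead-letter topic or error table so they can be inspected without blocking the load.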
What you need:
Basic Skills:
- 3+ years of hands-on experience with Snowflake data platform, including data modeling, performance tuning, and optimization.
- Strong experience with Apache Kafka for stream processing and real-time data integration.
- Proficiency in SQL and ETL/ELT processes.
- Solid understanding of cloud platforms such as AWS, Azure, or Google Cloud.
- Experience with scripting languages like Python, Shell, or similar for automation and data integration tasks.
- Familiarity with tools like dbt, Airflow, or similar orchestration platforms.
- Knowledge of data governance, security, and compliance best practices.
- Strong analytical and problem-solving skills with the ability to troubleshoot complex data issues.
- Ability to work in a collaborative team environment and communicate effectively with cross-functional teams
Responsibilities:
- Design, develop, and maintain Snowflake data warehouse solutions, leveraging advanced Snowflake features like clustering, partitioning, materialized views, and time travel to optimize performance, scalability, and data reliability.
- Architect and optimize ETL/ELT pipelines using tools such as Apache Airflow, DBT, or custom scripts, to ingest, transform, and load data into Snowflake from sources like Apache Kafka and other streaming/batch platforms.
- Work in collaboration with data architects, analysts, and data scientists to gather and translate complex business requirements into robust, scalable technical designs and implementations.
- Design and implement Apache Kafka-based real-time messaging systems to efficiently stream structured and semi-structured data into Snowflake, using Kafka Connect, KSQL, and Snowpipe for real-time ingestion.
- Monitor and resolve performance bottlenecks in queries, pipelines, and warehouse configurations using tools like Query Profile, Resource Monitors, and Task Performance Views.
- Implement automated data validation frameworks to ensure high-quality, reliable data throughout the ingestion and transformation lifecycle.
- Deploy and maintain pipeline monitoring solutions using Prometheus, Grafana, or cloud-native tools, ensuring efficient data flow, scalability, and cost-effective operations.
- Implement and enforce data governance policies, including role-based access control (RBAC), data masking, and auditing to meet compliance standards and safeguard sensitive information.
- Provide hands-on technical mentorship to junior data engineers, ensuring adherence to coding standards, design principles, and best practices in Snowflake, Kafka, and cloud data engineering.
- Stay current with advancements in Snowflake, Kafka, cloud services (AWS, Azure, GCP), and data engineering trends, and proactively apply new tools and methodologies to enhance the data platform.
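The automated data-validation responsibility above often starts with a simple reconciliation: compare source-side and warehouse-side row counts and flag drift beyond a tolerance. The sketch below is a stand-in; the hard-coded counts would really come from Kafka offsets and Snowflake's COPY/load history.

```python
# Sketch of a reconciliation check: flag when the warehouse is missing
# more rows than a tolerated fraction of the source batch.
# Counts are invented stand-ins for real source/target queries.
def reconcile(source_count: int, target_count: int, tolerance: float = 0.0):
    """Return (ok, missing): whether drift is within tolerance, and row gap."""
    missing = source_count - target_count
    ok = missing <= source_count * tolerance
    return ok, missing

# 10 rows missing out of 10,000 is within a 1% tolerance
ok, missing = reconcile(source_count=10_000, target_count=9_990, tolerance=0.01)
print(ok, missing)  # True 10
```

A failing check would typically page the on-call engineer or pause downstream transformations until the gap is explained.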


Job Description: Fresher Flutter Developer
Position: Flutter Developer (Fresher)
Location: Pune
Job Type: Full-time
Experience: 0-1 year (Freshers welcome)
Key Responsibilities:
• Develop and maintain mobile applications using the Flutter framework.
• Design, build, and implement user-friendly, responsive UI designs that function well across multiple device sizes and platforms.
• Integrate GetX for state management and navigation.
• Implement local database solutions using SQLite for offline data storage and synchronization.
• Collaborate with cross-functional teams to define, design, and ship new features.
• Debug and optimize performance for smooth, efficient mobile apps.
• Follow best practices for mobile development, including writing clean, maintainable, and efficient code.
Required Skills:
• Strong understanding of the Flutter framework and Dart language.
• Familiarity with GetX for state management, routing, and dependency injection.
• Experience working with SQLite for local database management.
• Knowledge of responsive UI design principles to create apps that adapt to various screen sizes.
• Basic understanding of RESTful APIs and integration with backend services.
Preferred Skills:
• Familiarity with version control systems like Git.
• Good problem-solving and debugging skills.
• Knowledge of mobile app lifecycle and architecture patterns (MVC/MVVM).
We are looking for a strategic, driven, and creative marketing mind to join our team and own all of our prospect-facing digital marketing efforts through the buyer’s journey. Working collaboratively with the founders, the digital marketing manager will be responsible for overseeing all online demand generation programs for Sarcon. This includes, but is not limited to, channels and tactics such as SEO, CRO, landing page optimization, SEM (AdWords), paid social, third-party review sites, and more. This position will play a lead role in expanding our presence, regularly optimizing our existing and upcoming channels, and evolving our brand outreach.
What You'll Do:
- Lead digital marketing planning and execution against specific MQL, MQA, pipeline, and revenue goals
- Strategize, execute, and measure all paid search, search ads, social ads, banner and display ads, etc.
- Own the strategy for all on-page SEO, landing page optimization, and conversion rate optimization
- Work collaboratively with other marketing team members and the Sales team on evolving our email nurture strategy and full-funnel prospect outreach
- Create and organize marketing and content calendars, deliverables, and program documentation
- Play a role in evolving our technology stack, discovering new opportunities for us to reach prospective customers
- Work closely with Sales on feedback, collaboration, and education
What We're Looking For:
- Have 5+ years of experience in marketing
- Have proven experience creating, managing, and optimizing marketing programs resulting in new business and positive ROI
- Have a strong history of B2B (preferably SaaS) marketing experience
- Have working knowledge of all Google platforms, including AdWords and Analytics
- Demonstrated ability to shift marketing messaging, tactics, and strategy for small, medium, and enterprise-sized companies
- Have strong analytical and critical thinking skills
- Work collaboratively and respectfully with other team members, regardless of position
- Have strong oral and written communications skills
- Enjoy learning and growing in new environments
- Experience working directly with Sales

Location: Pune/Nagpur/Bangalore/Hyderabad
Experience: 5-9 years
Should have good experience in Python.
Must have experience in AWS (any technology).
• Develop and manage business relationships with large organizations
• Represent Houm Technology in meetings with these organizations, present our Product concept and explain our Product value proposition
• Act as a single point of contact at Houm Technology for strategically important B2B relationships
• Plan and manage client accounts and coordinate consumer outreach activities
• Build new relationships with multiple B2B partners
• Create innovative ways to partner with organizations to offer our product to their consumers
Skillsets:
• MBA or equivalent degree from premium business schools
• Ideally 3 to 5 years of B2B relationship management experience
• Interpersonal relationship-building skills are key to success
• Articulate with excellent verbal and presentation skills
• Ability to smoothly interact with senior executives of the corporate world


Location – Gurugram (currently work from home)
Job Description –
• Should have worked with Python frameworks such as Django
• Should have good knowledge of MySQL, MongoDB, Hadoop, Big Data, and Docker.
• Should have a good understanding of API development in XML and JSON
• Should have worked with at least one cloud provider such as Google Cloud or AWS
• Should have worked on Google API integration
• Familiarity with SOA (micro-services/message buses/ etc.)
• Should always be ready with new innovative ideas
• Should have the ability to transform the requirements into scalable and smart code
• Responsible for unit design and coding.
• Responsible for implementing and following standards and guidelines with coding best practices in mind.
• Should be able to work with developers and architects, to ensure bug-free and timely delivery of allocated development tasks.
• Responsible for conducting proper unit testing.
• Should be able to take ownership of a product end-to-end
• Experience working with SAAS, and PAAS would be a plus
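The XML-and-JSON API bullet above can be sketched with the standard library alone: serving the same record in both formats. The `user` record and tag names are invented for illustration; a Django view would wrap this in proper HTTP responses.

```python
import json
import xml.etree.ElementTree as ET

# Minimal sketch: render the same record as JSON and as XML.
# The "user" record and root tag are invented examples.
record = {"id": 7, "name": "Asha"}

def to_json(rec: dict) -> str:
    return json.dumps(rec)

def to_xml(rec: dict, root_tag: str = "user") -> str:
    root = ET.Element(root_tag)
    for key, value in rec.items():
        # one child element per field, value coerced to text
        ET.SubElement(root, key).text = str(value)
    return ET.tostring(root, encoding="unicode")

print(to_json(record))  # {"id": 7, "name": "Asha"}
print(to_xml(record))   # <user><id>7</id><name>Asha</name></user>
```

In an API the format would usually be chosen from the request's `Accept` header, with the same underlying record feeding both serializers.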
Requirements -
• Eligibility: BE/BTech/MTech/MCA
• Degree in Computer Science or related field
• Critical thinking and problem-solving skills
• Able to contribute individually
• Good time-management and interpersonal skills
• Education background: BE/ BTech (Computer Sc or Electronics)/ MTech/ MCA with min 70% in their 10th/ 12th/ UG/ PG.