
Job Description for Database Consultant-I (PostgreSQL)
Job Title: Database Consultant-I (PostgreSQL)
Company: Mydbops
About Us:
Mydbops is a trusted leader with 8+ years of excellence in open-source database management. We deliver best-in-class services across MySQL, MariaDB, MongoDB, PostgreSQL, TiDB, Cassandra, and more. Our focus is on building scalable, secure, and high-performance database solutions for global clients. As a PCI DSS-certified and ISO-certified organisation, we are committed to operational excellence and data security.
Role Overview:
As a Database Consultant – I (PostgreSQL Team), you will take ownership of PostgreSQL database environments, offering expert-level support to our clients. This role involves proactive monitoring, performance tuning, troubleshooting, high availability setup, and guiding junior team members. You will play a key role in customer-facing technical delivery, solution design, and implementation.
Key Responsibilities:
- Manage PostgreSQL production environments for performance, stability, and scalability.
- Handle complex troubleshooting, performance analysis, and query optimisation.
- Implement backup strategies, recovery solutions, replication, and failover techniques.
- Set up and manage high availability architectures (Streaming Replication, Patroni, etc.).
- Work with DevOps/cloud teams for deployment and automation.
- Support upgrades, patching, and migration projects across environments.
- Use monitoring tools to proactively detect and resolve issues.
- Mentor junior engineers and guide troubleshooting efforts.
- Interact with clients to understand requirements and deliver solutions.
Requirements:
- 3–5 years of hands-on experience in PostgreSQL database administration.
- Strong Linux OS knowledge and scripting skills (Bash/Python).
- Proficiency in SQL tuning, performance diagnostics, and explain plans.
- Experience with tools like pgBackRest, Barman for backup and recovery.
- Familiarity with high availability, failover, replication, and clustering.
- Good understanding of AWS RDS, Aurora PostgreSQL, and GCP Cloud SQL.
- Experience with monitoring tools like pg_stat_statements, PMM, Nagios, or custom dashboards.
- Knowledge of automation/configuration tools like Ansible or Terraform is a plus.
- Strong communication and problem-solving skills.
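The requirements above call for Linux scripting (Bash/Python) and proactive monitoring. As a hedged illustration of the kind of script involved (the report format and the five-minute threshold below are assumptions for the sketch, not Mydbops tooling), a check that flags long-running queries from a pg_stat_activity-style report might look like:

```python
# Illustrative monitoring sketch: flag long-running queries from a
# pg_stat_activity-style report. Input format and threshold are assumptions.
from datetime import timedelta

def parse_duration(text: str) -> timedelta:
    """Parse an HH:MM:SS duration string into a timedelta."""
    h, m, s = (int(part) for part in text.split(":"))
    return timedelta(hours=h, minutes=m, seconds=s)

def long_running(rows, threshold=timedelta(minutes=5)):
    """Return (pid, query) pairs whose runtime exceeds the threshold."""
    return [(pid, query) for pid, runtime, query in rows
            if parse_duration(runtime) > threshold]

sample = [
    (4101, "00:00:02", "SELECT 1"),
    (4102, "00:12:40", "UPDATE orders SET ..."),
]
print(long_running(sample))  # → [(4102, 'UPDATE orders SET ...')]
```

In practice such a check would read live data (e.g. via a psql query on a schedule) rather than a hard-coded sample, but the filtering logic stays the same.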
Preferred Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or equivalent.
- PostgreSQL certification (EDB/Cloud certifications preferred).
- Past experience in a consulting, customer support, or managed services role.
- Exposure to multi-cloud environments and database-as-a-service platforms.
- Prior experience with database migrations or modernisation projects.
Why Join Us:
- Opportunity to work in a dynamic and growing industry.
- Learning and development opportunities to enhance your career.
- A collaborative work environment with a supportive team.
Job Details:
- Job Type: Full-time
- Work Days: 5 Days
- Work Mode: Work From Home
- Experience Required: 3–5 years

About Company:
Gevme is a fast-growing, Singapore-based virtual and hybrid event and engagement platform for building unique experiences. It is used by event professionals worldwide to build, operate, and monetise virtual events for some of the world's biggest brands. The platform's flexibility gives them limitless possibilities to turn any virtual event idea into reality. We have already powered hundreds of thousands of events around the world for clients such as Facebook, Netflix, Starbucks, Forbes, MasterCard, and the Singapore Government.
We are a product company with a strong engineering and family culture; we are always looking for new ways to enhance the event experience and empower efficient event management. We’re on a mission to groom the next generation of event technology thought leaders as we grow.
Join us if you want to become part of a vibrant and fast moving product company that's on a mission to connect people around the world through events.
Please check out our platform Gevme
We are on the lookout for a Customer Support Representative who will be the face of our company, turning the end-users who organise their events on our platform into advocates.
Location: Remote/Work from Home
What winning in this role looks like:
- Strengthen client relationship by being the go-to-person for client challenges relating to the platform
- Manage customer service inquiries and technical issues, communicating clearly and keeping support tickets and their status up to date
- Train and guide customers with the on-boarding process on Gevme
- Act as the "Voice of the Customer" by providing feedback to the development team on customer pain points
- Establish great relationships with internal stakeholders (Project, Product, Sales etc.) to achieve customers' goals
- Remain positive in challenging scenarios and inspire internal partners to do great work
- Assist Professional Services Team with ongoing customer deliverables if assigned
- Act as Coordinator for long-term DIY users for specific ad-hoc requirements
- Maintain the Support Portal with articles that will help with the client on-boarding process
- Post periodic best practices articles to help provide useful tips to our clients and users
You should:
- Have a technical background with knowledge of HTML / CSS / JS / jQuery and ideally Reactjs
- Have excellent communication skills, attention to detail, and a strong understanding of client requests
- Possess a can-do attitude and love interacting with clients both face-to-face and online
- Have a general understanding of the online space, including Software-as-a-Service (SaaS)
- Love to read up on anything online, be it social media, technological trends or the latest marketing techniques.
Requirements:
- Diploma/ Degree in IT or equivalent in computer sciences
- Strong communication skills, excelling at managing clients face-to-face or online
- Past experience in a client-facing function such as Account Management or Customer Support
- Minimum 1-2 years of experience in Customer Support-related roles
- Independent, highly-motivated and results-driven, able to thrive under pressure while taking pride in customer delight
- Excellent interpersonal skills with positive outlook
- Experience with any other event management and support tools a huge plus point
- HTML / CSS / JS / jQuery + ideally Reactjs (bonus)
Dhwani is looking for an experienced (5-10 years) MySQL database administrator who will be responsible for ensuring the performance, availability, and security of clusters of MySQL instances. The person will also be responsible for orchestrating upgrades, backups, and provisioning of database instances, and will work in tandem with the other teams, preparing documentation and specifications as required.
Job Responsibilities
1. Provision MySQL instances, both in clustered and non-clustered configurations
2. Ensure performance, security, and availability of databases
3. Prepare documentation and specifications
4. Handle common database procedures, such as upgrade, backup, recovery, migration, etc.
5. Profile server resource usage, optimize and tweak as necessary
6. Collaborate with other team members and stakeholders
7. Write queries and generate reports
Required Skills
1. Strong proficiency in MySQL database management, decent experience with recent versions of MySQL.
2. Understanding of MySQL’s underlying storage engines, such as InnoDB and MyISAM
3. Experience with replication configuration in MySQL
4. Knowledge of de-facto standards and best practices in MySQL
5. Proficient in writing and optimizing SQL statements
6. Knowledge of MySQL features, such as its event scheduler
7. Ability to plan resource requirements from high-level specifications
8. Familiarity with other SQL/NoSQL databases, along with monitoring and proxy tools such as MaxScale and ProxySQL
9. Working in a Linux environment is a must.
10. Knowledge of Docker is an advantage.
11. Knowledge of limitations in MySQL and their workarounds in contrast to other popular relational databases.
12. Should be hands-on with writing complex queries and generating reports as per requirements
13. Experience in handling multi-location databases
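The skills above include proficiency in writing and optimizing SQL statements and handling replication. As a hedged sketch of one routine DBA task (the output format is modeled on MySQL's `SHOW REPLICA STATUS`, but the parsing and threshold here are illustrative, not Dhwani tooling), a replication-lag check might look like:

```python
# Illustrative replication-lag check: parse key/value lines in the style of
# MySQL's SHOW REPLICA STATUS output and flag excessive lag.
# The input format and threshold are assumptions for this sketch.
def parse_status(text: str) -> dict:
    """Parse 'Key: Value' lines into a dict, skipping malformed lines."""
    status = {}
    for line in text.splitlines():
        if ":" in line:
            key, _, value = line.partition(":")
            status[key.strip()] = value.strip()
    return status

def lag_alert(status: dict, max_lag_seconds: int = 60) -> bool:
    """Return True if the replica is lagging beyond the threshold."""
    lag = status.get("Seconds_Behind_Source")
    if lag is None or lag == "NULL":  # NULL means replication is broken/stopped
        return True
    return int(lag) > max_lag_seconds

raw = """Replica_IO_Running: Yes
Replica_SQL_Running: Yes
Seconds_Behind_Source: 125"""
print(lag_alert(parse_status(raw)))  # → True
```

A real check would pull the status over a client connection and feed an alerting system; this shows only the decision logic.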
Education
Bachelor’s degree in an analytical field, such as information technology, science, or an engineering discipline.
Work Schedule: 6 days per week; remote working is acceptable.
JOB DESCRIPTION
Founded by experienced founders and funded by Tier-1 VCs, it's a solution for democratizing the shopping experience on e-commerce platforms. Our aim is to provide a superior shopping experience for all our partners and improve both customer satisfaction and their GMV. Being an early-stage company, we are looking for self-driven, motivated people who want to build something exciting and are always looking out for the next big thing. We plan to build this company remotely, which brings freedom but also an added sense of responsibility. If all this sounds interesting to you, read on.
Responsibilities
- Writing testable and efficient code
- Design and implementation of low-latency, high-availability, and performant applications
- Implementation of security and data protection
- Implementing business logic and developing APIs and services
- Build reusable code and libraries for future use.
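As a rough sketch of the "developing APIs and services" responsibility (illustrative only; the role's stack is Node.js with frameworks like Restify, and the endpoint here is hypothetical), a minimal JSON health endpoint has this shape:

```python
# Minimal JSON API sketch (hypothetical /health endpoint; the posting's
# actual stack is Node.js/Restify, this just illustrates the shape).
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def health_payload(version: str = "1.0") -> bytes:
    """Build the JSON body returned by the /health endpoint."""
    return json.dumps({"status": "ok", "version": version}).encode()

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/health":
            body = health_payload()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

# To serve: HTTPServer(("127.0.0.1", 8080), Handler).serve_forever()
```

Keeping the response-building logic in a plain function (rather than inside the handler) is what makes code like this testable, per the "writing testable and efficient code" responsibility above.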
Skills And Qualifications
- 2-3 years of hands-on experience in back-end development with Node.js.
- Knowledge of Node.js frameworks such as Restify
- Good understanding of server-side templating languages
- Basic understanding of front-end technologies, such as HTML5 and CSS3
- Expertise with Linux based systems
- Proficient understanding of code versioning tools, such as Git
- Experience with cloud-based platforms and tooling such as AWS, GCP, Docker, or Kubernetes
Traits we value
- Independent, resourceful, analytical, and able to solve problems effectively
- Ability to be flexible, agile, and thrive in chaos
- Excellent oral and written communication skills
The Merck Data Engineering Team is responsible for designing, developing, testing, and supporting automated end-to-end data pipelines and applications on Merck’s data management and global analytics platform (Palantir Foundry, Hadoop, AWS and other components).
The Foundry platform comprises multiple different technology stacks, which are hosted on Amazon Web Services (AWS) infrastructure or on-premises in Merck's own data centers. Developing pipelines and applications on Foundry requires:
• Proficiency in SQL / Java / Python (Python required; all 3 not necessary)
• Proficiency in PySpark for distributed computation
• Familiarity with Postgres and ElasticSearch
• Familiarity with HTML, CSS, and JavaScript and basic design/visual competency
• Familiarity with common databases and access technologies (e.g. JDBC, MySQL, Microsoft SQL Server); not all are required
This position will be project based and may work across multiple smaller projects or a single large project utilizing an agile project methodology.
Roles & Responsibilities:
• Develop data pipelines by ingesting various data sources – structured and unstructured – into Palantir Foundry
• Participate in the end-to-end project lifecycle, from requirements analysis to go-live and operations of an application
• Act as a business analyst when developing requirements for Foundry pipelines
• Review code developed by other data engineers and check against platform-specific standards, cross-cutting concerns, coding and configuration standards and functional specification of the pipeline
• Document technical work in a professional and transparent way. Create high quality technical documentation
• Work out the best possible balance between technical feasibility and business requirements (the latter can be quite strict)
• Deploy applications on Foundry platform infrastructure with clearly defined checks
• Implementation of changes and bug fixes via Merck's change management framework and according to system engineering practices (additional training will be provided)
• DevOps project setup following Agile principles (e.g. Scrum)
• Besides working on projects, act as third-level support for critical applications; analyze and resolve complex incidents/problems. Debug problems across the full Foundry stack and code based on Python, PySpark, and Java
• Work closely with business users, data scientists/analysts to design physical data models
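As a hedged, Foundry-free sketch of the ingest-and-clean step such a pipeline performs (the record format and field names below are assumptions for illustration, not Merck's actual data):

```python
# Illustrative pipeline step (no Foundry/PySpark dependency): ingest raw
# delimited records, drop malformed rows, and cast the value column.
# The 'id|name|value' record format is an assumption for this sketch.
def ingest(lines):
    """Split raw 'id|name|value' lines into field lists."""
    return [line.split("|") for line in lines]

def clean(records):
    """Keep well-formed records and cast the value column to float."""
    out = []
    for rec in records:
        if len(rec) == 3:
            try:
                out.append((rec[0].strip(), rec[1].strip(), float(rec[2])))
            except ValueError:
                pass  # drop rows with non-numeric values
    return out

raw = ["1|aspirin|12.5", "bad row", "2|ibuprofen|x", "3|naproxen|7.0"]
print(clean(ingest(raw)))  # → [('1', 'aspirin', 12.5), ('3', 'naproxen', 7.0)]
```

On Foundry the same ingest/clean/validate stages would typically be expressed as PySpark DataFrame transforms for distributed execution; the structure of the step is what carries over.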
The ROLE
- Handle the inbound interest received by Marquee
- Get the inbound leads to set up an online demo with us
- Conduct brief research on the prospect’s company and background prior to the demo to ensure a smooth sales call
- Conduct online demos with the prospects and turn them into paying customers
- Ensure that the product demo is neat and organized
- Work with the prospects to understand their needs and pitch them the right Marquee product to solve their problems
- Manage on-going relationships with customers
- Run strategic follow-ups (calls/texts/emails) with prospects to maximise sales
- Strive to achieve the sales targets
- Work with our Founder on strategic alliances - potentially travelling to various financial centers globally
Here are some important things to consider prior to applying to us:
- Knowledge/prior experience with startup fund raising is a distinct advantage
- WE'RE A REMOTE TEAM so you can work from wherever you like - you don't have to attend office on a day to day basis
- You should own a laptop
- EXCELLENT written & spoken English required
- We shall provide a 5–7 day training session once you join

What you will do:
- Creating and updating proprietary models/spreadsheets for the prospective investments
- Analysing financial information and conducting analytic and strategic research.
- Inputting data into proprietary financial models and ensuring the accuracy of data and output based on the data.
- Creating Automations using VBA and other tools wherever needed.
- Compiling historical data in respect of stocks and companies from publicly available sources.
- Updating and maintaining databases to track relevant financial, economic or other indicators which may be relevant to the sector and/or region under coverage
- Assisting with other company and industry related analysis as may be required by the Fund Manager
- Monitoring relevant market news and information
- Assisting with the preparation and development of research reports, industry primers and marketing presentations.
Requirements:
- Financial modelling experience is a must, and the candidate should be excellent at it. This is the main part of the job, along with research on the financial numbers.
- Excellent understanding of Financial Statements and Accounting Standards.
- Qualified CA
- Financial Statements Audit experience preferred
- The ability to work independently and proactively
- A passion for equities
- Strong proficiency in Advanced Excel, VBA.
- Proficiency in data science tools and languages (SQL and Python) is a strong plus but not a requirement
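Since the role lists Python alongside financial modelling, here is a hedged sketch of one model building block, compound annual growth rate (CAGR). The figures are hypothetical, not drawn from any actual coverage:

```python
# Sketch of one building block of a financial model: compound annual
# growth rate (CAGR). All figures below are hypothetical.
def cagr(begin: float, end: float, years: float) -> float:
    """CAGR = (end/begin)**(1/years) - 1."""
    return (end / begin) ** (1 / years) - 1

revenue = {2019: 100.0, 2023: 150.0}  # hypothetical revenue figures
growth = cagr(revenue[2019], revenue[2023], years=4)
print(f"{growth:.2%}")  # → 10.67%
```

In practice this would live in a spreadsheet model or VBA macro per the requirements above; the formula is identical either way.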



Job Description
We are looking for a data scientist who will help us discover the information hidden in vast amounts of data, and help us make smarter decisions to deliver even better products. Your primary focus will be applying data mining techniques, conducting statistical analysis, and building high-quality prediction systems integrated with our products.
Responsibilities
- Selecting features, building and optimizing classifiers using machine learning techniques
- Data mining using state-of-the-art methods
- Extending the company’s data with third-party sources of information when needed
- Enhancing data collection procedures to include information that is relevant for building analytic systems
- Processing, cleansing, and verifying the integrity of data used for analysis
- Doing ad-hoc analysis and presenting results in a clear manner
- Creating automated anomaly detection systems and constantly tracking their performance
Skills and Qualifications
- Excellent understanding of machine learning techniques and algorithms, such as linear regression, SVMs, decision forests, LSTMs, CNNs, etc.
- Experience with deep learning preferred.
- Experience with common data science toolkits, such as R, NumPy, MATLAB, etc. Excellence in at least one of these is highly desirable
- Great communication skills
- Proficiency in using query languages such as SQL, Hive, Pig
- Good applied statistics skills, such as statistical testing, regression, etc.
- Good scripting and programming skills
- Data-oriented personality
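One of the responsibilities above is creating automated anomaly detection systems. As a minimal, hedged sketch (the z-score approach, threshold, and data here are illustrative, not the company's actual method):

```python
# Minimal anomaly-detection sketch: flag points more than k sample standard
# deviations from the mean. Threshold and data are illustrative only.
from statistics import mean, stdev

def anomalies(values, k=3.0):
    """Return indices of values lying more than k sigma from the mean."""
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []  # constant series: nothing can be anomalous
    return [i for i, v in enumerate(values) if abs(v - mu) / sigma > k]

series = [10.1, 9.8, 10.0, 10.2, 9.9, 25.0, 10.1]
print(anomalies(series, k=2.0))  # → [5]
```

A production system would track the detector's own performance over time (false-positive rate, drift in the baseline mean), per the responsibilities list; this shows only the core detection rule.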

