About BookEventz
• Job Description:
We are looking for a MySQL Database Administrator (DBA) who will be responsible for the performance, tuning, clean-up, migration, availability, and security of clusters of MySQL instances. The candidate will also orchestrate upgrades, backups, and provisioning of database instances, and will work in tandem with other teams, preparing documentation and specifications as required.
• Key Responsibilities:
✓ Provision MySQL instances in both clustered and non-clustered configurations
✓ Ensure the performance, tuning, availability, and security of clusters of MySQL instances
✓ Develop database schemas
✓ Develop entity-relationship diagrams (ERDs)
✓ Find the root cause of database-related issues and perform database clean-up
✓ Migrate data from customer databases to the Serosoft product database
✓ Prepare documentation and specifications
✓ Handle common database procedures, such as upgrades, backups, recovery, and migration
✓ Profile server resource usage; optimize and tweak as necessary
✓ Collaborate with other team members and stakeholders
• Skills and Qualifications:
✓ Strong proficiency in MySQL database management
✓ Solid experience with recent versions of MySQL
✓ Understanding of MySQL’s underlying storage engines, such as InnoDB and MyISAM
✓ Experience with replication configuration in MySQL
✓ Proficiency in database clean-up and migrations
✓ Knowledge of de-facto standards and best practices in MySQL
✓ Proficiency in writing and optimizing SQL statements
✓ Proficiency with table scans, row scans, query execution plans, SQL profilers, etc.
✓ Ability to decide when to create views (standard, indexed, and partitioned views)
✓ Knowledge of MySQL features, such as its Event Scheduler
✓ Ability to plan resource requirements from high-level specifications
✓ Familiarity with other SQL/NoSQL databases, such as PostgreSQL and MongoDB
✓ Knowledge of MySQL’s limitations and their workarounds, in contrast to other popular relational databases
✓ Proficient understanding of code versioning tools such as SVN or Git
✓ A DBA certification would be an added advantage
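Query-plan work like the above can be sketched with Python's built-in sqlite3 module; this is an illustrative stand-in, not MySQL itself (MySQL's `EXPLAIN` plays the analogous role), and the table and column names are invented.

```python
import sqlite3

# Illustrative sketch of inspecting a query plan before and after adding an
# index. sqlite3 is used only because it ships with Python; in MySQL the
# equivalent step is running EXPLAIN on the query. Schema names are made up.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE bookings (id INTEGER PRIMARY KEY, venue TEXT, event_date TEXT)"
)
conn.executemany(
    "INSERT INTO bookings (venue, event_date) VALUES (?, ?)",
    [(f"venue-{i % 50}", f"2024-01-{i % 28 + 1:02d}") for i in range(1000)],
)

query = "SELECT id FROM bookings WHERE venue = 'venue-7'"

# Without an index, the planner falls back to a full table scan.
before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(before[0][3])  # e.g. a SCAN over the whole table

# After adding an index, the planner switches to an index search.
conn.execute("CREATE INDEX idx_bookings_venue ON bookings (venue)")
after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(after[0][3])  # e.g. a SEARCH using idx_bookings_venue
```

The same before/after comparison is how a DBA verifies that a new index actually changes the execution plan rather than just adding write overhead.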
Job Description
We are looking for a Customer Integration Engineer to help new and existing customers with all aspects of implementing our solution. You will work closely with our sales team to onboard our new clients.
Responsibilities
- You will be the main point of contact for the new customer’s technical and implementation teams, getting new customers live, or existing customers live with new projects
- You will guide the client’s tech team through the entire integration process
- Ensure continuity between sales and customer success
Skills & Requirements
- 1 to 3 years of experience in customer integration at SaaS companies
- Experience with frontend web development is a must
- Experience with, and understanding of, REST APIs
- Exposure to mobile app technologies is a plus
- SaaS and B2B experience is a must
- Comfortable communicating with team members at all levels, both internally and externally
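The REST API work mentioned above can be sketched with Python's standard library alone. The `/v1/projects` endpoint and its payload below are hypothetical examples for illustration, not a real product API.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Minimal sketch of the request/response shape of a REST integration.
# Endpoint path and JSON payload are invented; a real integration would
# target the vendor's documented API.
class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/v1/projects":
            body = json.dumps({"projects": [{"id": 1, "status": "live"}]}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # silence per-request logging

# Port 0 asks the OS for any free port; serve in a background thread.
server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_address[1]}/v1/projects"
with urllib.request.urlopen(url) as resp:
    data = json.load(resp)
server.shutdown()

print(data["projects"][0]["status"])  # prints "live"
```

Walking a client's tech team through exactly this cycle, request, status code, JSON body, is the day-to-day of the integration role described here.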
Join us
- To work at one of the few world-class DevTools SaaS companies in India
- To work with a global team that delivers 1 billion optimized media files each day (by comparison, Google answers 6 billion queries a day)
- For the opportunity to work in a dynamic team that is async-first and remote
- For an open learning environment with a lot of freedom
- For leadership opportunities in a very transparent, result-oriented, and fast-growing team
What we are looking for:
We are looking for someone who:
- Has a proven record of SEO performance for dynamic content at large scale (something like news or location-based listings)
- Has seen organic traffic in the multi-millions from SEO
- Can take ownership of complex projects
We are seeking an experienced Chief Technology Officer with a passion for esports to join our team. The ideal candidate will have a strong track record of leading and managing technical teams and a proven ability to drive technology strategy and innovation. Experience in the esports industry is a plus.
Responsibilities:
- Lead the development and execution of our technology roadmap, aligning with business goals and objectives
- Oversee the design, development, and maintenance of our technology infrastructure and systems
- Manage and mentor the technology team, including setting goals and providing direction and support
- Collaborate with other departments to identify and prioritize technical needs and opportunities
- Monitor and analyze industry and market trends, and assess their potential impact on the company
- Communicate technical plans and progress to senior management and stakeholders
Requirements:
- Strong professional experience as a CTO or similar leadership role
- Proven ability to lead and manage technical teams
- Strong understanding of software development and delivery processes
- Experience with driving technology strategy and innovation
- Excellent communication and collaboration skills
Preferred:
- Experience in the esports industry
- Experience with cloud technologies such as AWS or Azure
- Experience with agile development methodologies
- Familiarity with a variety of programming languages and technologies
- Experience with budget and resource management
With the advent of remote workplaces, it is imperative to migrate the learning and development function to remote work. Our client is a SaaS product that helps companies train their employees using digital tools, wherever they are. It offers a cloud-based learning management system along with a vast library of learning content, helping businesses deliver on the main pillars of talent development, which include training, engagement, automation, and measurement, and thus supporting their growth.
Founded in 2012, our client is headquartered in Mumbai and was founded by a graduate of Boston University. They have raised a total of 30M in funding over one round. So far in their journey, they have approximately 94% customer satisfaction and 1 Lakh+ activities and learning assets, and their clientele includes Amazon, Raymond, CARATLANE, etc.
As an Enterprise Salesperson, you will be responsible for prospecting relentlessly to build pipeline and for building strong personal relationships with prospects.
What you will do:
- Hiring, training and managing your own sales and business development team
- Delivering sales pitch, demo, presentation and proposal to potential clients
- Creating reliable forecasts and being completely transparent with management on the pipeline status
- Closing new business consistently at or above quota level
- Meeting senior executives and CXOs and building your own network
- Providing the company's training to clients
- Following up regularly with existing clients to ensure they are happy
- Creating case-studies of client success stories
- Building global channel partners
- Investing in colleagues and giving coaching and advice when you see an opportunity for improvement
- Listening to the needs of the market and sharing insights with product and marketing teams
- Making the company the default choice for any enterprise evaluating a learning, engagement and performance automation solution
Desired Candidate Profile
What you need to have:
- Experience in selling IT or SaaS solutions
- Strong understanding of the sales funnel
- Comfort working with numbers
- Ability to analyze, calculate and reason a situation effectively
- A sharp eye for detail
- Superior presentation skills
- Ability to speak and write clearly and correctly
- Ability to manage a large number of prospect situations simultaneously
- Ability to position company products against direct and indirect competitors
- Willingness to travel
- Experience with LMS, B2B, SaaS, IT, cloud, enterprise software, e-learning companies preferred
Title: Data Engineer (Azure) (Location: Gurgaon/Hyderabad)
Salary: Competitive as per Industry Standard
We are expanding our Data Engineering Team and hiring passionate professionals with extensive
knowledge and experience in building and managing large enterprise data and analytics platforms. We
are looking for creative individuals with strong programming skills, who can understand complex
business and architectural problems and develop solutions. The individual will work closely with the rest
of our data engineering and data science team in implementing and managing Scalable Smart Data
Lakes, Data Ingestion Platforms, Machine Learning and NLP based Analytics Platforms, Hyper-Scale
Processing Clusters, Data Mining and Search Engines.
What You’ll Need:
- 3+ years of industry experience in creating and managing end-to-end Data Solutions, Optimal
Data Processing Pipelines and Architecture dealing with large volume, big data sets of varied
data types.
- Proficiency in Python, Linux and shell scripting.
- Strong knowledge of working with PySpark dataframes, Pandas dataframes for writing efficient pre-processing and other data manipulation tasks.
- Strong experience in developing the infrastructure required for data ingestion and the optimal extraction, transformation, and loading of data from a wide variety of data sources, using tools like Azure Data Factory and Azure Databricks (or Jupyter notebooks/Google Colab, or other similar tools).
- Working knowledge of GitHub or other version control tools.
- Experience with creating RESTful web services and API platforms.
- Ability to work with data science and infrastructure team members to implement practical machine learning solutions and pipelines in production.
- Experience with cloud providers like Azure/AWS/GCP.
- Experience with SQL and NoSQL databases: MySQL, Azure Cosmos DB, HBase, MongoDB, Elasticsearch, etc.
- Experience with stream-processing systems (Spark Streaming, Kafka, etc.) and working experience with event-driven architectures.
- Strong analytic skills related to working with unstructured datasets.
Good to have (to filter or prioritize candidates)
- Experience with testing libraries such as pytest for writing unit tests for the developed code.
- Knowledge of machine learning algorithms and libraries; implementation experience would be an added advantage.
- Knowledge and experience of data lakes, Docker, and Kubernetes.
- Knowledge of Azure Functions, Elasticsearch, etc.
- Experience with model versioning (MLflow) and data versioning would be beneficial.
- Experience with microservices libraries, or with Python libraries such as Flask for hosting ML services and models, would be great.
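The extraction, transformation, and loading steps described above can be sketched with the standard library alone. In the stack this role describes, such a step would typically be a PySpark job orchestrated by Azure Data Factory; the CSV source, column names, and unit-conversion rule here are invented for illustration.

```python
import csv
import io
import json

# Stdlib sketch of a single ETL step: parse raw CSV, clean and normalize
# the rows, then serialize for loading. All field names are hypothetical.
raw_csv = """device_id,reading,unit
a1,12.5,C
a2,,C
a3,48.2,F
"""

def extract(text):
    """Extract: parse the raw CSV into a list of dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop empty readings and normalize everything to Celsius."""
    out = []
    for row in rows:
        if not row["reading"]:
            continue  # skip rows with missing measurements
        value = float(row["reading"])
        if row["unit"] == "F":
            value = (value - 32) * 5 / 9  # Fahrenheit -> Celsius
        out.append({"device_id": row["device_id"], "celsius": round(value, 2)})
    return out

def load(rows):
    """Load: serialize to JSON lines, standing in for a write to a data lake."""
    return "\n".join(json.dumps(r) for r in rows)

records = transform(extract(raw_csv))
print(load(records))
```

In PySpark the same shape appears as a chain of DataFrame operations (filter, withColumn, write), but the extract/transform/load decomposition is identical.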
Summary
Our Kafka developer has a combination of technical skills, communication skills, and business knowledge. The developer should be able to work on multiple medium-to-large projects. The successful candidate will have excellent technical skills in Apache/Confluent Kafka and an enterprise data warehouse (preferably GCP BigQuery, or an equivalent cloud EDW), and will be able to take oral and written business requirements and develop efficient code to meet set deliverables.
Must Have Skills
- Participate in the development, enhancement, and maintenance of data applications, both as an individual contributor and as a lead.
- Lead the identification, isolation, resolution, and communication of problems within the production environment.
- Act as lead developer, applying technical skills in Apache/Confluent Kafka (preferred) or AWS Kinesis (optional), and a cloud enterprise data warehouse: Google BigQuery (preferred), AWS Redshift, or Snowflake (optional).
- Design and recommend the best approach for data movement from different sources to the cloud EDW using Apache/Confluent Kafka.
- Perform independent functional and technical analysis for major projects supporting several corporate initiatives.
- Communicate and work with IT partners and the user community at various levels, from senior management to individual developers to business SMEs, for project definition.
- Work on multiple platforms and multiple projects concurrently.
- Perform code and unit testing for complex-scope modules and projects.
- Provide expertise and hands-on experience with Kafka Connect using a schema registry in a very high-volume environment (~900 million messages).
- Provide expertise in Kafka brokers, ZooKeeper, KSQL, KStreams, and Kafka Control Center.
- Provide expertise and hands-on experience with AvroConverters, JsonConverters, and StringConverters.
- Provide expertise and hands-on experience with Kafka connectors such as MQ connectors, Elasticsearch connectors, JDBC connectors, FileStream connectors, and JMS source connectors, as well as tasks, workers, converters, and transforms.
- Provide expertise and hands-on experience with custom connectors using the Kafka core concepts and API.
- Working knowledge of the Kafka REST proxy.
- Ensure optimum performance, high availability, and stability of solutions.
- Create topics, set up redundancy clusters, deploy monitoring tools and alerts, and have good knowledge of best practices.
- Create stubs for producers, consumers, and consumer groups to help onboard applications from different languages/platforms. Leverage Hadoop ecosystem knowledge to design and develop capabilities that deliver our solutions using Spark, Scala, Python, Hive, Kafka, and other tools in the Hadoop ecosystem.
- Use automation tools for provisioning, such as Jenkins, uDeploy, or relevant technologies.
- Ability to perform data-related benchmarking, performance analysis, and tuning.
- Strong skills in in-memory applications, database design, and data integration.
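The produce/consume pattern that runs through the list above can be sketched in-memory. This is a stand-in, not Kafka: a real implementation would use a Kafka client (e.g. confluent-kafka) against a broker, with converters configured on Kafka Connect. Here `queue.Queue` stands in for a topic partition, and the message schema is invented for illustration.

```python
import json
import queue

# In-memory stand-in for a Kafka topic partition. JSON serialization below
# mirrors what a JsonConverter does on the Connect side; the order records
# are hypothetical example messages.
topic = queue.Queue()

def produce(records):
    """Serialize each record as UTF-8 JSON bytes and append it to the topic."""
    for rec in records:
        topic.put(json.dumps(rec).encode("utf-8"))

def consume():
    """Drain the topic, deserializing each message back into a dict."""
    out = []
    while not topic.empty():
        out.append(json.loads(topic.get().decode("utf-8")))
    return out

produce([{"order_id": 1, "amount": 250}, {"order_id": 2, "amount": 125}])
messages = consume()
print(len(messages), messages[0]["order_id"])  # prints "2 1"
```

The stubs mentioned above ("create stubs for producers, consumers and consumer groups") serve exactly this purpose: giving application teams a minimal produce/consume skeleton to build against before they touch the real cluster.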
Project delivery, project management, resource allocation, project planning, project mentoring, team management, vendor management, risk management, and project oversight/coordination/management.
Looking for a proficient Backend Engineer (Java) for a leading e-commerce company.
- Proficient in Java, with a good knowledge of its ecosystems
- Great OO skills, including strong knowledge in design and architectural patterns
- Skill for writing reusable Java libraries
- Experience with Play Framework for Java
- Experience with JavaScript & frameworks like AngularJS
- Knowledge of concurrency patterns in Java
- Familiarity with the concepts of MVC, JDBC, and RESTful web services
- Basic understanding of JVM, its limitations, weaknesses, and workarounds
- Implementing automated testing platforms and unit tests
- Working knowledge of NoSQL (preferably MongoDB)
- Proficient understanding of versioning tools, such as Git
- Desire to contribute to the wider technical community through collaboration
- Ability to quickly grasp any new technologies
- Strong communication and collaboration skills