Job description
The role encompasses administration of MongoDB databases, with responsibility for ensuring the performance, high availability and security of MongoDB clusters.
- The candidate will be responsible for ensuring that database management policies, processes and procedures are followed, adhere to ITIL good-practice principles and are subject to continuous improvement in line with PCI standards.
- The candidate will be responsible for reviewing system design changes to ensure they adhere to expected service standards, and for recommending changes to maximize the stability, availability and efficiency of the supported applications.
- The candidate should understand the application functionality and business logic, work with application stakeholders to understand requirements, discuss new application features and propose the right solutions.
What you'll do
- Install, deploy and manage MongoDB on physical and virtual machines
- Create, configure and monitor large-scale, secure MongoDB sharded clusters
- Support MongoDB in a high availability, multi-datacenter environment
- Administer MongoDB Ops Manager monitoring, backups and automation
- Configure and monitor numerous MongoDB instances and replica sets
- Automate routine tasks with your own scripts and open-source tools (a health-check sketch follows this list)
- Improve database backups and test recoverability regularly
- Study the database needs of our applications and optimize them using MongoDB
- Maintain database performance and capacity planning
- Write documentation and collaborate with technical peers online
- All database administration tasks, such as backup, restore, query optimization, infrastructure provisioning, setting up graphing, monitoring and alerting tools, and replication
- Performance tuning for high throughput
- Architecting high availability servers
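As a flavour of the scripting these tasks involve, here is a minimal sketch of a replica-set health check using PyMongo. The connection URI and the alerting hook are placeholders, not details from this posting.

```python
# Minimal replica-set health check (sketch; assumes PyMongo and a reachable cluster).
from pymongo import MongoClient

# Hypothetical connection URI; point this at your own replica set.
client = MongoClient("mongodb://db-host:27017/?replicaSet=rs0")

# replSetGetStatus is the standard admin command behind rs.status();
# it requires clusterMonitor (or equivalent) privileges.
status = client.admin.command("replSetGetStatus")

for member in status["members"]:
    # stateStr is e.g. "PRIMARY", "SECONDARY" or "ARBITER".
    print(f'{member["name"]}: {member["stateStr"]} (health={member["health"]})')
    if member["health"] != 1:
        # Placeholder for whatever alerting you wire in (mail, Grafana, Ops Manager).
        print(f'ALERT: {member["name"]} is not healthy')
```

A cron job running a script like this is a common stopgap; Ops Manager, as listed above, covers the same ground with less custom code.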
What qualifications will you need to be successful?
Skills and Qualifications
- Minimum 1 year of experience with MongoDB technologies, and at least 3 years of experience in database administration overall.
- Install, deploy and manage MongoDB on physical machines, virtual machines and AWS EC2 instances
- Should have experience setting up MongoDB active-active sharded clusters with high availability
- Should have experience administering MongoDB on the Linux platform
- Experience with MongoDB version upgrades, preferably from version 4.0 to 4.4, in a production environment with zero or minimal application downtime, using either Ops Manager or a custom script
- Experience building database monitoring using tools such as AppDynamics, ELK and Grafana
- Experience in database performance tuning, covering both script tuning and hardware configuration, and capacity planning.
- Good understanding of and experience with MongoDB sharding and disaster recovery planning
- Design and implement the backup strategy and BCP process across the MongoDB environments, maintaining a uniform backup strategy across the platform
- Define database monitoring, monitoring thresholds and alerts; validate the notifications; and maintain the documentation for future reference
- Tune database performance based on application requirements and maintain a stable environment; analyse existing MongoDB queries as part of the performance improvement programme (a query-analysis sketch follows this list)
- Work with the engineering team to understand database requirements, guide them on best practices and optimize queries for better performance
- Work with application stakeholders to understand production requirements and propose effective database solutions
- Review and understand the ongoing business reports and create new ad hoc reports as required
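For the query-analysis duties above, here is a minimal sketch of inspecting a query plan with PyMongo's explain command; the database, collection and filter names are hypothetical.

```python
# Sketch: check how many documents a query scans versus returns (PyMongo).
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # placeholder URI
db = client["appdb"]  # hypothetical database name

# Server-side counterpart of cursor.explain(); "orders" and the filter are made up.
plan = db.command(
    "explain",
    {"find": "orders", "filter": {"status": "PENDING"}},
    verbosity="executionStats",
)

stats = plan["executionStats"]
print("docs examined:", stats["totalDocsExamined"])
print("docs returned:", stats["nReturned"])
# A high examined-to-returned ratio usually points to a missing or unused index.
```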

Similar jobs
- Big data developer with 8+ years of professional IT experience and expertise in Hadoop ecosystem components spanning ingestion, data modeling, querying, processing, storage, analysis and data integration, implementing enterprise-level Big Data systems.
- A skilled developer with strong problem solving, debugging and analytical capabilities, who actively engages in understanding customer requirements.
- Expertise in Apache Hadoop ecosystem components such as Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, Zookeeper, YARN, Flume, Pig, NiFi, Scala and Oozie.
- Hands-on experience in creating real-time data streaming solutions using Apache Spark Core, Spark SQL and DataFrames, Kafka, Spark Streaming and Apache Storm.
- Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters, including the NameNode, DataNode, ResourceManager, NodeManager and Job History Server.
- Worked on both Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
- Well versed in the installation, configuration and management of Big Data workloads and the underlying infrastructure of a Hadoop cluster.
- Hands-on experience in coding MapReduce/YARN programs using Java, Scala and Python for analyzing Big Data.
- Exposure to Cloudera development environment and management using Cloudera Manager.
- Extensively worked on Spark with Scala on clusters for analytics; installed Spark on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL/Oracle.
- Implemented Spark using Python, utilizing DataFrames and the Spark SQL API for faster data processing; handled importing data from different sources into HDFS using Sqoop and performing transformations using Hive and MapReduce.
- Used the Spark DataFrames API on the Cloudera platform to perform analytics on Hive data.
- Hands-on experience with Spark MLlib, used for predictive intelligence, customer segmentation and smooth maintenance in Spark Streaming.
- Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
- Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Worked on creating data pipelines for ingestion and aggregation events, loading consumer response data into Hive external tables in HDFS to serve as feeds for Tableau dashboards.
- Hands on experience in using Sqoop to import data into HDFS from RDBMS and vice-versa.
- In-depth understanding of Oozie for scheduling Hive/Sqoop/HBase jobs.
- Hands on expertise in real time analytics with Apache Spark.
- Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
- Extensive experience working with ETL tool environments such as SSIS and Informatica, and reporting tool environments such as SQL Server Reporting Services (SSRS).
- Experience with the Microsoft cloud and with setting up clusters on Amazon EC2 and S3, including automating the setup and extension of clusters in the AWS cloud.
- Extensively worked on Spark with Python on clusters for analytics; installed Spark on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL.
- Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
- Knowledge of the installation, configuration, support and management of Hadoop clusters using Apache and Cloudera (CDH3, CDH4) distributions, and on Amazon Web Services (AWS).
- Experienced in writing ad hoc queries using Cloudera Impala, including Impala analytical functions.
- Experience in creating DataFrames using PySpark and performing operations on them using Python (a PySpark sketch follows this list).
- In-depth understanding of Hadoop architecture and its various components, including HDFS, the MapReduce programming paradigm, high availability and the YARN architecture.
- Established connections to multiple Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provided the access needed for pulling information for analysis.
- Generated various knowledge reports using Power BI based on business specifications.
- Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
- Experienced in projects using JIRA, testing, and Maven and Jenkins build tools.
- Experienced in designing, building, deploying and utilizing almost the entire AWS stack (including EC2 and S3), focusing on high availability, fault tolerance and auto-scaling.
- Good experience with use-case development and with software methodologies such as Agile and Waterfall.
- Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
- Good working experience importing data using Sqoop and SFTP from various sources such as RDBMS, Teradata, mainframes, Oracle and Netezza into HDFS, and performing transformations on it using Hive, Pig and Spark.
- Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
- Proficient in NoSQL databases including HBase, Cassandra and MongoDB, and their integration with Hadoop clusters.
- Hands-on experience in Hadoop Big Data technology, working with MapReduce, Pig and Hive as analysis tools, and Sqoop and Flume as data import/export tools.
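To make the Spark DataFrame and Spark SQL items above concrete, here is a minimal PySpark sketch; the file path and column names are illustrative, not taken from the listing.

```python
# Minimal PySpark sketch: load a file into a DataFrame and query it with Spark SQL.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("response-feed").getOrCreate()

# Hypothetical HDFS path; in the workflow described above, data lands here via Sqoop/Flume.
df = spark.read.option("header", "true").csv("hdfs:///data/consumer_responses.csv")

# Register the DataFrame for SQL, then aggregate (the "channel" column is assumed).
df.createOrReplaceTempView("responses")
summary = spark.sql(
    "SELECT channel, COUNT(*) AS n FROM responses GROUP BY channel ORDER BY n DESC"
)
summary.show()

spark.stop()
```

From here, a saveAsTable or insertInto step would populate the Hive external tables that feed the Tableau dashboards mentioned above.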
Please find below the JD for Cisco Engineer (N+C) + Cisco Engineer (R&S) + SD-WAN
Up to 10 years of hands-on experience in managing LAN, WAN, SD-WAN, DC networking and Wi-Fi.
• Strong understanding of TCP/IP, routing protocols, L2/L3 switches and Wi-Fi 802.11, with SD-WAN network experience.
• Experience in the implementation and troubleshooting of wireless controllers, access points, 802.1X and Wi-Fi protocols (802.11a/b/g/n/ae/ac)
• Good understanding of VLAN, VTP, DTP, 802.1Q trunking, STP, MSTP, ACL and SNMP configuration.
• Experience working on routers and switches (Cisco ASR, ISR, Catalyst, Nexus)
• Experience in configuring and troubleshooting BGP, OSPF, IS-IS, EIGRP and static routes.
• Experience in configuring and troubleshooting STP, MSTP, RSTP, VSTP, HSRP, DHCP.
• Should have experience in the configuration and management of the Cisco Viptela SD-WAN solution.
• Prior experience in implementation and migration projects involving the above-mentioned technologies is mandatory.
• Strong troubleshooting and problem-solving skills.
• Positive, communicative, and customer-oriented attitude.
● Creating RESTful APIs with Node.js
● Collaborating with front-end developers on the integration of elements.
● Implementing effective security protocols, data protection measures and storage solutions.
● Maintaining all the required documents for your project.
● Constantly coming up with new ideas and implementing them to improve the app’s performance.
● Defining and communicating technical and design requirements.
● Learning about new technologies and staying up to date with current best practices.
● Creating unit and integration tests to ensure code quality
Requirements
● Knowledge of databases and familiarity with schema design in NoSQL (e.g. MongoDB); a small schema sketch follows this list.
● Knowledge of relational databases like MySQL is preferred.
● A good understanding of the software development lifecycle
● Knowledge of API design and development using REST
● Knowledge of version control systems like Git.
● Good understanding of object-oriented programming (OOP) and OOP patterns.
● Again, you don’t have to know it all in depth, but you should know how to dig through the internet to find solutions.
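To illustrate the NoSQL schema-design point, here is a small PyMongo sketch of the embedding-versus-referencing choice; all names are hypothetical.

```python
# Sketch: embedding vs. referencing, the central MongoDB schema-design decision.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # placeholder URI
db = client["shop"]  # hypothetical database

# Embedded design: order lines live inside the order document, so one read
# returns everything -- a good fit when children are always fetched with the
# parent and stay small.
db.orders.insert_one({
    "customer": "alice",
    "items": [
        {"sku": "A-1", "qty": 2, "price": 9.99},
        {"sku": "B-7", "qty": 1, "price": 24.50},
    ],
})

# Referenced design: customers are shared across orders, so they get their own
# collection and orders point to them by _id.
customer_id = db.customers.insert_one({"name": "alice"}).inserted_id
db.orders.insert_one({"customer_id": customer_id, "items": []})
```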

Job description
Job title: Sr. React Js Developer
Job Description:
- 3+ years of extensive experience in developing single-page applications with responsive design for mobile and web using React JS, HTML5, CSS3 and Bootstrap.
- Strong proficiency in JavaScript, including DOM manipulation and the JavaScript object model.
- Experience with popular React.js workflows (such as Flux or Redux)
- Familiarity with RESTful APIs
- Knowledge of modern authorization mechanisms, such as JSON Web Tokens
- Thorough understanding of React.js and its core principles.
- Experience with common front-end development tools such as Babel, Webpack, npm, Yarn, etc.
- Familiarity with newer specifications of ECMAScript (ES6)
- Ability to understand business requirements and translate them into technical requirements
- Good verbal and written communication skills
- Excellent analytical and problem-solving skills.
- Strong ability to drive end-to-end service design from usability and experience perspectives.
- Working knowledge of ADA and designing for users with unique physical and mental challenges.
Desired Skills:
- Meeting with the development team to discuss user interface ideas and applications.
- Reviewing application requirements and interface designs.
- Identifying web-based user interactions.
- Developing and implementing highly responsive user interface components using React concepts.
- Writing application interface code using JavaScript, following React.js workflows.
- Troubleshooting interface software and debugging application codes.
- Developing and implementing front-end architecture to support user interface concepts.
- Monitoring and improving front-end performance.
- Documenting application changes and developing updates
Someone with hands-on experience in storyboarding, typography, color, vector illustration, social media ads, carousels and image selection.
Other design skills: layout, illustrations, infographics, creativity, flexibility, attention to detail, deadline orientation and desktop publishing tools.
Strong liaison skills to deliver assignments on time.
Up to date with industry-leading software and technologies (Illustrator, InDesign, CorelDRAW, Photoshop, etc.).
Professionalism regarding time, costs and deadlines.
Proven graphic design skills, backed by a strong portfolio.
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
- Advanced working SQL knowledge and experience with relational databases, including query authoring (SQL) and working familiarity with a variety of databases.
- Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- We are looking for a candidate with 5+ years of experience in a Data Engineer role who has attained a graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. They should also have experience using the following software/tools:
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. (an Airflow sketch follows this list)
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Experience with stream-processing systems: Storm, Spark-Streaming, etc.
- Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
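As an illustration of the workflow-management tooling named above, here is a minimal Airflow DAG sketch using the TaskFlow API (Airflow 2.4+); the task bodies and schedule are placeholders.

```python
# Minimal Airflow DAG sketch: daily extract -> transform -> load.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def etl_pipeline():
    @task
    def extract() -> list[dict]:
        # Placeholder: in practice this might pull from Postgres, S3 or Kafka.
        return [{"id": 1, "value": 10}, {"id": 2, "value": 20}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        return [{**r, "value": r["value"] * 2} for r in rows]

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder: write to Redshift, Hive, etc.
        print(f"loaded {len(rows)} rows")

    load(transform(extract()))


etl_pipeline()
```

Azkaban and Luigi express the same dependency graph through their own job-definition formats; the DAG-of-tasks idea carries over.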

1. Strong knowledge of front-end scripting such as EJS, JavaScript and jQuery.
2. Proficiency with fundamental front-end languages such as HTML and CSS.
3. Familiarity with JavaScript frameworks such as AngularJS, React and Ember.
4. Proficiency with server-side languages such as Python, Ruby, Java, PHP or .NET.
5. Good understanding of database technologies such as MySQL, Oracle and MongoDB.

- Understand PHP, JavaScript and jQuery.
- Have deep knowledge of working with the platform APIs.
- Use version control for Shopify theme development.
- Possess excellent soft skills, such as communication and teamwork.
- Have an extensive portfolio and testimonials of happy clients.


Minimum 5+ years in React
Must have worked on desktop applications
Must have worked on both frontend and backend
5+ years of experience with React JS; work closely with design, product management and development teams to create elegant, usable, responsive and interactive interfaces across multiple devices. Experience with the Electron JS framework is a plus.

Experience: 6+ Years
Essential Skills & Experience
- Must have a minimum of 3 years of experience using Angular 2 and above.
- Develop modern software applications, work with the team on solution definition, and guide teams through execution and implementation
- Ensuring high performance on mobile and desktop
- Must have experience writing tested, idiomatic and documented Angular, JavaScript, HTML and CSS
- Must have knowledge of OOP concepts, industry best practices and architecture designs.
- Must have a good understanding of web technologies and enterprise-level applications.
- Coordinating the workflow between the graphic designer, the HTML coder, and yourself
- Cooperating with the back-end developer in the process of building the RESTful API
- Communicating with external web services
- Experience should include structured source code management using SVN or Git, and builds using Maven/Grunt.
- Good to have backend experience in any one of the backend technologies mentioned below:
- Java with Spring Boot.
- PHP
- Need to have good experience in web application design.
- Good to have experience in Agile methodology.
Nice to Have
- Experience with web servers like Nginx or NodeJS
- Basic awareness of cloud orchestration frameworks like Kubernetes/Docker
- Understanding of Unit Testing Frameworks
BEHAVIORAL SKILLS
- Self-motivated and a quick learner
- Ability to consistently perform and meet deadlines
- Attention to detail and follow-through
- A good understanding of customer satisfaction
- Ability to work effectively in a team as well as in an individual environment
- Excellent written and verbal communication skills

