
- Develop, implement and manage our social media strategy
- Define the most important social media KPIs
- Manage and oversee social media content
- Measure the success of every social media campaign
- Stay up to date with the latest social media best practices and technologies
- Use social media marketing tools such as Buffer
- Attend educational conferences
- Work with copywriters and designers to ensure content is informative and appealing
- Collaborate with Marketing, Sales and Product Development teams
- Monitor SEO and user engagement and suggest content optimization
- Communicate with industry professionals and influencers via social media to create a strong network
- Hire and train others on the team
- Provide constructive feedback
- Adhere to rules and regulations
- Present to Senior Management

- Big Data developer with 8+ years of professional IT experience, with expertise in Hadoop ecosystem components for ingestion, data modeling, querying, processing, storage, analysis and data integration, and in implementing enterprise-level Big Data systems.
- A skilled developer with strong problem-solving, debugging and analytical capabilities, who actively engages in understanding customer requirements.
- Expertise in Apache Hadoop ecosystem components such as Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, ZooKeeper, YARN, Flume, Pig, NiFi, Scala and Oozie.
- Hands-on experience in creating real-time data streaming solutions using Apache Spark Core, Spark SQL & DataFrames, Kafka, Spark Streaming and Apache Storm.
- Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters, which include the NameNode, DataNode, ResourceManager, NodeManager and Job History Server.
- Worked on both Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
- Well versed in installing, configuring and managing Big Data and the underlying infrastructure of a Hadoop cluster.
- Hands-on experience in coding MapReduce/YARN programs using Java, Scala and Python for analyzing Big Data.
- Exposure to the Cloudera development environment and management using Cloudera Manager.
- Extensively worked on Spark using Scala on clusters for computational analytics; installed it on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL/Oracle.
- Implemented Spark using Python, utilizing the DataFrames and Spark SQL APIs for faster data processing; handled importing data from different data sources into HDFS using Sqoop and performed transformations using Hive and MapReduce before loading the data into HDFS (see the PySpark sketch after this list).
- Used the Spark DataFrames API on the Cloudera platform to perform analytics on Hive data.
- Hands-on experience with Spark MLlib, used for predictive intelligence, customer segmentation and smooth maintenance in Spark Streaming (see the MLlib sketch after this list).
- Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
- Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Working on creating data pipelines for different events of ingestion and aggregation, loading consumer-response data into Hive external tables in an HDFS location to serve as the feed for Tableau dashboards.
- Hands-on experience in using Sqoop to import data into HDFS from RDBMS and vice versa.
- In-depth understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
- Hands-on expertise in real-time analytics with Apache Spark.
- Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python (the sketch after this list shows one such conversion).
- Extensive experience working with different ETL tool environments like SSIS and Informatica, and reporting tool environments like SQL Server Reporting Services (SSRS).
- Experience in the Microsoft cloud and in setting up clusters on Amazon EC2 & S3, including automating the setup and extension of clusters in the AWS cloud.
- Extensively worked on Spark using Python on clusters for computational analytics; installed it on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL.
- Strong experience and knowledge of real-time data analytics using Spark Streaming, Kafka and Flume (see the streaming sketch after this list).
- Knowledge of installing, configuring, supporting and managing Hadoop clusters using Apache and Cloudera (CDH3, CDH4) distributions and on Amazon Web Services (AWS).
- Experienced in writing ad-hoc queries using Cloudera Impala, including Impala analytical functions.
- Experience in creating DataFrames using PySpark and performing operations on them using Python.
- In-depth understanding of Hadoop architecture and its various components, such as HDFS, the MapReduce programming paradigm, High Availability and the YARN architecture.
- Established multiple connections to different Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provided access for pulling the information needed for analysis.
- Generated various knowledge reports using Power BI based on business specifications.
- Developed interactive Tableau dashboards to provide a clear understanding of industry-specific KPIs, using quick filters and parameters to handle them more efficiently.
- Experienced in projects using JIRA, testing, and the Maven and Jenkins build tools.
- Experienced in designing, building, deploying and utilizing almost the entire AWS stack (including EC2 and S3), focusing on high availability, fault tolerance and auto-scaling.
- Good experience with use-case development and with software methodologies like Agile and Waterfall.
- Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
- Good working experience in importing data using Sqoop and SFTP from various sources like RDBMS, Teradata, mainframes, Oracle and Netezza into HDFS, and performing transformations on it using Hive, Pig and Spark.
- Extensive experience in text analytics, developing different statistical machine learning solutions to various business problems and generating data visualizations using Python and R (a small scikit-learn sketch follows this list).
- Proficient in NoSQL databases including HBase, Cassandra and MongoDB, and their integration with Hadoop clusters.
- Hands-on experience in Hadoop Big Data technology, working with MapReduce, Pig and Hive as analysis tools, and Sqoop and Flume as data import/export tools.
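Several bullets above reference analytics on Hive data via the Spark DataFrames and Spark SQL APIs, and converting Hive/SQL queries into RDD transformations. Below is a minimal, hypothetical PySpark sketch of that pattern, not taken from any of these projects; the table and column names (sales.orders, region, amount) are placeholders, and it assumes a Spark installation with Hive metastore support.

```python
# Minimal PySpark sketch: the same aggregation via the DataFrame API,
# Spark SQL, and raw RDD transformations. Names are placeholders.
from operator import add

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("hive-analytics-sketch")
    .enableHiveSupport()  # read tables registered in the Hive metastore
    .getOrCreate()
)

orders = spark.table("sales.orders")  # hypothetical Hive table

# DataFrame API: total order amount per region
per_region = (
    orders.groupBy("region")
          .agg(F.sum("amount").alias("total_amount"))
          .orderBy(F.desc("total_amount"))
)
per_region.show()

# Equivalent Spark SQL over the same Hive table
spark.sql("""
    SELECT region, SUM(amount) AS total_amount
    FROM sales.orders
    GROUP BY region
    ORDER BY total_amount DESC
""").show()

# The same query expressed as RDD transformations (the Hive/SQL-to-RDD
# conversion one of the bullets mentions)
totals = (
    orders.rdd
          .map(lambda row: (row["region"], row["amount"]))
          .reduceByKey(add)  # SUM(amount) ... GROUP BY region
)
print(totals.take(5))

spark.stop()
```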
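For the real-time analytics bullets, here is a hedged Spark Structured Streaming sketch that consumes a Kafka topic and lands the stream as Parquet. The broker address, topic name and paths are placeholders, and it assumes the spark-sql-kafka connector package is on the classpath.

```python
# Hedged sketch of a Spark Structured Streaming job reading from Kafka.
# Broker, topic and output/checkpoint paths below are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

events = (
    spark.readStream
         .format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")  # placeholder
         .option("subscribe", "consumer-events")            # placeholder topic
         .load()
)

# Kafka delivers key/value as binary; cast the payload to string for parsing
decoded = events.select(F.col("value").cast("string").alias("raw"))

query = (
    decoded.writeStream
           .format("parquet")                       # land the stream on HDFS
           .option("path", "/data/events")          # placeholder path
           .option("checkpointLocation", "/chk/events")
           .start()
)
query.awaitTermination()
```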
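The MLlib bullet mentions customer segmentation; a minimal sketch of that technique with k-means clustering follows, assuming a hypothetical feature table (analytics.customer_features) with numeric recency/frequency/monetary columns.

```python
# Minimal Spark MLlib customer-segmentation sketch using k-means.
# The feature table and column names are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.clustering import KMeans

spark = (
    SparkSession.builder
    .appName("segmentation-sketch")
    .enableHiveSupport()
    .getOrCreate()
)

customers = spark.table("analytics.customer_features")  # hypothetical table

# MLlib expects the features packed into a single vector column
assembler = VectorAssembler(
    inputCols=["recency", "frequency", "monetary"],
    outputCol="features",
)
features = assembler.transform(customers)

# Fit k-means with 4 clusters (k chosen arbitrarily for the sketch)
model = KMeans(k=4, seed=42, featuresCol="features").fit(features)

segmented = model.transform(features)  # adds a 'prediction' cluster column
segmented.groupBy("prediction").count().show()
```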
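Finally, for the text-analytics bullet, a small self-contained scikit-learn baseline (TF-IDF features feeding a linear classifier); the documents and labels below are invented purely for illustration.

```python
# Toy text-classification baseline: TF-IDF features + logistic regression.
# Documents and labels are invented for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

docs = [
    "late delivery, very unhappy",
    "great product, fast shipping",
    "package arrived damaged",
    "excellent service and support",
]
labels = [0, 1, 0, 1]  # 0 = negative, 1 = positive (toy data)

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(docs, labels)
print(model.predict(["slow shipping but nice product"]))
```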
Apply only if you can join within 2 weeks of selection.
Ideal for candidates who would like to be part of the learning process across growing technologies and to embark on the journey toward a tech-architect path: very deep into technologies, and eager to learn and adapt as technology advances.
Job Description:
• Proficiency in ASP.NET/.NET Core, Web API, MVC, Entity Framework and MS SQL/MySQL is mandatory.
• Experience with HTML5/CSS3, JavaScript/jQuery and Angular; web services: SOAP, RESTful, MVC API
• Willing to learn new technologies
• Good knowledge of, or experience with, the Agile/Scrum development method
• Detail-oriented, self-motivated and disciplined; the ability to complete assigned tasks with minimal oversight in a professional and timely manner is a must
|| #5daysworking || #flexibleworkinghours || #employeeorientedpolicy || #attractivebonus
Hands-on experience with advanced Excel and effective communication skills
Preferred location: Mumbai
We are hiring an MIS Executive at Jaro Education. If you think you are the best match for the job, do contact us.
Requirements:
- Developing user interface components and implementing them following well-known React.js concepts and workflows.
- Building reusable components and front-end libraries for future use.
- Translating designs and wireframes into high-quality code
- In-depth knowledge of JavaScript and React concepts, excellent front-end coding skills, and a good understanding of progressive web applications.
- Optimizing components for maximum performance across a vast array of web-capable devices and browsers.
- Developing a flexible and well-structured front-end architecture, along with the APIs to support the applications
●Solving questions from students across the globe on the TutorBin board.
●Working on tasks involving various subjects/software related to undergraduate/postgraduate courses in Electronics and Communication Engineering.
●Reviewing the work completed by tutors on our platform and providing the necessary instructions for rectification as required.
●Ensuring the overall quality of work provided to the students from our platform.
●Management of the tutors on our platform, including their onboarding and performance reviews.
●Planning & implementing the training of new tutors on our platform.
Skills
●A proper understanding of the different programming software/subjects related to undergraduate courses in the Electronics and Communication Engineering domain.
●Knowledge of Multisim, LTspice, PSpice and Proteus will be a plus.
●Excellent interpersonal and communication skills
●Problem-solving attitude with a good command of logical reasoning
●Ability to work independently with minimal supervision
●Keen to learn about new tools & technologies for use in changing situations.
●Comfortable working in a fast-paced environment with great efficiency.
Education: B.E./B.Tech/M.Tech in Electronics and Communication Engineering
Culture @ TutorBin
●Refreshments - Snacks served during working hours.
●Celebrations & fun activities - We value our employees and their tireless efforts in building each block of the company. Regular company get-togethers are organised to create a very healthy working environment and greater bonding between members.
●Always listening - Got an innovative idea? You can pitch it upfront and work on implementing it as well.
Primary Responsibilities
- Design, architect and develop advanced software solutions in a cross-functional Agile team supporting multiple projects and initiatives
- Collaborate with product owners and/or the business on requirements definition, development of functional specifications, and design
- Collaborate on or lead development of technical design and specifications as required
- Code, test and document new applications as well as changes to existing system functionality and ensure successful completion
- Take on leadership roles as needed
Skills & Requirements
- Bachelor’s Degree required, preferably in Computer Science or related field
- 3+ years of software development experience using the Go or Java programming languages
- Experience with cloud technologies (AWS/Azure/GCP/Pivotal Cloud Foundry/any private cloud) and containerization is required
- Experience with a micro-services architecture is a plus
- Excellent communication, collaboration, reporting, analytical and problem solving skills
- Experience with PostgreSQL or other Relational Databases
- Test-driven development mindset and a focus on quality, scalability and performance
- Strong programming fundamentals and ability to produce high quality code
- Solid understanding of Agile (SCRUM) Development Process required
Job Summary
We are looking for a PHP Developer responsible for managing back-end services and the interchange of data between the server and the users. Your primary focus will be the development of all server-side logic, definition and maintenance of the central database, and ensuring high performance and responsiveness to requests from the front-end. You will also be responsible for integrating the front-end elements built by your co-workers into the application. Therefore, a basic understanding of front-end technologies is necessary as well.
- Minimum 5 years of experience.
- Strong knowledge of PHP frameworks and platforms (such as Zend, OpenCart)
- Advanced understanding of front-end technologies, such as JavaScript and JS-based frameworks like jQuery.
- Understanding of MVC design patterns.
- Good hands-on experience integrating payment APIs and developing CRM software.
- Preference for experience with REST APIs.
- Proficient understanding of code versioning tools, such as Git.