
The company is changing the way people explore cities around the world and have fun. Content in different formats plays a critical role in communicating our mission and offering to prospective guests at different stages of their buying funnel. We are looking for a Technical SEO Manager to leverage content and support our organic channel growth. SEO lies at the heart of our global growth strategy for the next few years. You'll be the point person for our SEO. Period. You'll be responsible for the full spectrum of SEO work, from on-page and technical SEO to off-site work, conversion rate optimization, and link building.
Responsibilities
- Own technical site audits for primary and secondary websites.
- Analyze website traffic data, linking structures, crawl budget, index coverage, and content to make smart recommendations that increase website traffic and conversion rates.
- Own tactics that help grow the organic channel, including data analysis and forecasting of future growth.
- Analyze and report on SEO performance, owning weekly, monthly, and quarterly
reporting, and ad hoc analyses for tests and optimizations.
- Top-down keyword research and content gap analysis.
- Automate repetitive tasks, audits, processes wherever possible.
- Ensure pro-SEO decisions are embedded within all aspects of our business.
- Analyze points of failure and suggest optimisations to the underlying processes different teams use, to solve for them.
- Work closely with all facets of the organisation, including product management, engineering, and creative design, to drive decision making for organic search.
- Stay up-to-date on industry trends, the competitive landscape, and SEO strategies
to ensure we nurture best-in-class, value-add opportunities.
- Become a part of the ongoing mission to grow the organic channel by 100% every 6 months over the next 2 years.
We are looking for
- Minimum 3 years of technical SEO experience, with a proven track record of improving organic rankings for eCommerce or related B2C businesses.
- Good technical understanding of website usability, server logs, structured data, XML sitemaps, website performance, and indexation issues.
- Functional knowledge of Python or Apps Script.
- Proficient in crawl management, including log file analysis, Screaming Frog, and custom crawlers.
- Proficient with data analysis & web analytics
- Prior knowledge of SQL is a plus.
- Knowledge of and ability to hand-code or understand valid semantic HTML, CSS, and JavaScript is a plus.
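The posting above asks for functional Python to automate repetitive audits. As an illustration only (not part of the posting), here is a minimal sketch of the kind of sitemap hygiene check such a role might script, using only the Python standard library; the sitemap content and checks are hypothetical examples.

```python
# Hypothetical audit sketch: parse an XML sitemap and flag entries that
# fail two simple hygiene checks (non-HTTPS <loc>, missing <lastmod>).
# The inline sitemap is sample data, not a real site.
import xml.etree.ElementTree as ET

SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/paris/</loc><lastmod>2024-01-10</lastmod></url>
  <url><loc>http://example.com/rome/</loc></url>
  <url><loc>https://example.com/tokyo/</loc><lastmod>2024-02-01</lastmod></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def audit_sitemap(xml_text):
    """Return (url, issue) pairs for entries that fail the basic checks."""
    issues = []
    root = ET.fromstring(xml_text)
    for url in root.findall("sm:url", NS):
        loc = url.findtext("sm:loc", default="", namespaces=NS)
        if not loc.startswith("https://"):
            issues.append((loc, "non-https loc"))
        if url.find("sm:lastmod", NS) is None:
            issues.append((loc, "missing lastmod"))
    return issues

if __name__ == "__main__":
    for loc, issue in audit_sitemap(SITEMAP_XML):
        print(f"{issue}: {loc}")
```

In practice a script like this would fetch the live sitemap and feed the flagged URLs into the weekly reporting the role owns; the checks here are deliberately minimal.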

About Us:
With a mission to upgrade the digital marketing industry with profound practical expertise and innovation, InSnap Technologies started operations in 2018. It was founded as a subsidiary of Spokesly Inc., based out of California, USA.
InSnap has grown into a family of some of the best talent from around the world, who have empowered thousands of businesses to achieve their digital marketing goals. Powered by business intelligence data, vision, and out-of-the-box thinking, InSnap provides novel solutions and tools that put to rest all the digital marketing worries of a B2B enterprise.
Job description:
Eligibility Criteria:
· Comfortable using AI tools (ChatGPT, MidJourney, Canva AI, Jasper, etc.) to create content, designs, and email copy.
· Willingness to learn SaaS marketing automation tools (ActiveCampaign, n8n, Chameleon etc.).
· Basic skills in copywriting, design, and analytics.
· Basic understanding of HTML & CSS
· Strong problem-solving ability and curiosity to experiment.
· Process-driven with an eye for detail (QA before things go live).
Key Responsibilities:
· Working on end-to-end email marketing operations, including creating content, designing emails, and setting them up in automation.
· Executing email campaigns and tracking results.
· Optimizing Emails & Automation workflows for user activation, engagement, conversion and upgrades.
· Researching the topics that interest our target audience most and creating lead magnets around them.
· Staying up to date with advanced marketing strategies for effective campaigns.
· Working with tools like Mixpanel, Chameleon, etc. to improve user experience and activation.
Job Summary
We are looking for a senior-level Paid Media Expert. You should have worked for one or more eCommerce brands, or at a digital marketing agency handling international clients, and have managed a team. You will analyze campaign performance, make optimizations, and install tracking pixels.
What you will do:
● Create content that meets internal/client standards
● Research opportunities for new social marketing / Google Ads platforms.
● Review and approve content on a daily basis
● Create and manage monthly report
● Stay up to date with social media and Google Ads trends and best practices
● Collaborate with the team to create social media, Google Ads, and video content.
● Monitor analytics with the rest of the team to identify viable ideas, trends, and growth patterns and turn them into insights.
● Prepare accurate reports on the overall performance of our marketing campaigns.
● Understand audience behavior and create segmentation.
● Positive attitude and a team player.
● Smart creative problem-solving skills.
Requirements
● You must have good knowledge of Google Ads formats such as Search, Display, Performance Max, and video ads, and be able to set up, manage, and optimize campaigns on Google Ads and YouTube.
● Develop and deliver social media optimization (SMO) and social media marketing (SMM) across Facebook, Instagram, LinkedIn, Pinterest, and Twitter.
● Analyse analytics to identify areas of opportunity and use conversion rate optimization tactics (e.g., A/B and split testing) to squeeze as much ROI as possible out of every amount spent.
● Prepare performance analysis reports and make recommendations for corrective modifications with a view to ongoing optimization.
● Prepare and manage digital marketing campaigns across Facebook Ads, Google Ads, Instagram Ads, LinkedIn, and Pinterest, monitoring revenue generation, website traffic, conversions, and conversion rates.
● Coordinate with advertising and media experts to improve marketing results.
● Identify the latest trends and technologies affecting Facebook Ads.
Perks
● 5-day work week
● Friendly & supportive teammates
● Festival & birthday celebrations
● Great work culture
● Leave encashment
● Performance-based monitoring & appreciation
Responsibilities:
- Good understanding of programming languages
- Concepts of Java and Maven
- Source control
- Logically strong coding practices
- Understanding of data structures
- Security: encryption and decryption
- Data encoding and decoding
Requirements:
- Strong core Java development skills.
- Exposure to security tools.
- Experience in working with customers, understanding business requirements
- Good analytical and troubleshooting skills.
- Working knowledge of Jenkins/Maven
- Key requirements: Core Java, CI/CD, Docker, Maven, Jenkins, REST APIs, Kafka, Redis, caching, ElastiCache
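The responsibilities above mention data encoding/decoding and security operations. Purely as an illustrative sketch (the role itself is Java; this uses Python's standard library for brevity), here is the distinction between reversible transport encoding and a keyed integrity check; all names and the sample payload are our own, not from the posting.

```python
# Illustrative only: contrast reversible encoding (base64) with a keyed
# integrity check (HMAC). Real encryption would use a vetted crypto
# library (e.g., JCA in Java); this sketch deliberately stops short of that.
import base64
import hashlib
import hmac

def encode_payload(data: bytes) -> str:
    """Reversible transport encoding -- NOT encryption."""
    return base64.urlsafe_b64encode(data).decode("ascii")

def decode_payload(text: str) -> bytes:
    return base64.urlsafe_b64decode(text.encode("ascii"))

def sign(key: bytes, data: bytes) -> str:
    """Keyed integrity check: detects tampering, does not hide content."""
    return hmac.new(key, data, hashlib.sha256).hexdigest()

def verify(key: bytes, data: bytes, signature: str) -> bool:
    # compare_digest avoids timing side channels on the comparison.
    return hmac.compare_digest(sign(key, data), signature)

if __name__ == "__main__":
    msg = b"order:42"
    token = encode_payload(msg)
    assert decode_payload(token) == msg
    print(verify(b"secret-key", msg, sign(b"secret-key", msg)))
```

The point of the sketch is the conceptual split the bullets gesture at: encoding is reversible by anyone, while signing requires the key, and neither by itself is encryption.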
- Big data developer with 8+ years of professional IT experience, with expertise in Hadoop ecosystem components spanning ingestion, data modeling, querying, processing, storage, analysis, and data integration, and in implementing enterprise-level Big Data systems.
- A skilled developer with strong problem solving, debugging and analytical capabilities, who actively engages in understanding customer requirements.
- Expertise in Apache Hadoop ecosystem components such as Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, Zookeeper, YARN, Flume, Pig, Nifi, Scala, and Oozie.
- Hands on experience in creating real - time data streaming solutions using Apache Spark core, Spark SQL & DataFrames, Kafka, Spark streaming and Apache Storm.
- Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters, including the NameNode, DataNode, ResourceManager, NodeManager, and Job History Server.
- Worked with both Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
- Well versed in installation, Configuration, Managing of Big Data and underlying infrastructure of Hadoop Cluster.
- Hands-on experience in coding MapReduce/YARN programs using Java, Scala, and Python for analyzing Big Data.
- Exposure to Cloudera development environment and management using Cloudera Manager.
- Extensively worked on Spark with Scala on clusters for analytics, installing it on top of Hadoop and building advanced analytical applications using Spark with Hive and SQL/Oracle.
- Implemented Spark jobs in Python using DataFrames and the Spark SQL API for faster processing; handled importing data from different sources into HDFS using Sqoop, performing transformations with Hive and MapReduce, and loading the results into HDFS.
- Used Spark Data Frames API over Cloudera platform to perform analytics on Hive data.
- Hands-on experience with Spark MLlib for predictive intelligence and customer segmentation, and with maintaining Spark Streaming jobs.
- Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
- Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Worked on creating data pipelines for ingestion and aggregation of different events, loading consumer response data into Hive external tables in HDFS to serve as a feed for Tableau dashboards.
- Hands on experience in using Sqoop to import data into HDFS from RDBMS and vice-versa.
- In-depth Understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
- Hands on expertise in real time analytics with Apache Spark.
- Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
- Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
- Experience in the Microsoft cloud and in setting up clusters on Amazon EC2 & S3, including automating the setup and extension of clusters in the AWS cloud.
- Extensively worked on Spark with Python on clusters for analytics, installing it on top of Hadoop and building advanced analytical applications using Spark with Hive and SQL.
- Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
- Knowledge in installation, configuration, supporting and managing Hadoop Clusters using Apache, Cloudera (CDH3, CDH4) distributions and on Amazon web services (AWS).
- Experienced in writing Ad Hoc queries using Cloudera Impala, also used Impala analytical functions.
- Experience in creating Data frames using PySpark and performing operation on the Data frames using Python.
- In depth understanding/knowledge of Hadoop Architecture and various components such as HDFS and MapReduce Programming Paradigm, High Availability and YARN architecture.
- Established multiple connections to different Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provided access for pulling the information needed for analysis.
- Generated various kinds of knowledge reports using Power BI based on Business specification.
- Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
- Well experienced in projects using JIRA, testing, Maven, and Jenkins build tools.
- Experienced in designing, building, deploying, and utilizing almost the entire AWS stack (including EC2 and S3), focusing on high availability, fault tolerance, and auto-scaling.
- Good experience with use-case development, with Software methodologies like Agile and Waterfall.
- Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
- Good working experience importing data using Sqoop and SFTP from sources such as RDBMS, Teradata, Mainframes, Oracle, and Netezza into HDFS, and performing transformations on it using Hive, Pig, and Spark.
- Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
- Proficient in NoSQL databases including HBase, Cassandra, and MongoDB, and their integration with Hadoop clusters.
- Hands-on experience in Hadoop Big Data technologies, working with MapReduce, Pig, and Hive as analysis tools, and Sqoop and Flume as data import/export tools.
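One bullet above mentions converting Hive/SQL queries into RDD-style transformations. As a conceptual sketch only (plain Python with no Spark dependency, so it runs anywhere; the data and function names are our own), here is how a simple SQL aggregation maps onto that shape:

```python
# Illustrative sketch: the Hive/SQL query
#   SELECT country, SUM(clicks) FROM t GROUP BY country
# expressed as the map/reduceByKey shape used in RDD code
# (in PySpark: rdd.map(lambda r: (r.country, r.clicks)).reduceByKey(add)).
from collections import defaultdict

# Rows as (country, clicks) pairs -- stand-ins for ingested records.
ROWS = [("IN", 3), ("US", 5), ("IN", 2), ("DE", 4), ("US", 1)]

def sum_clicks_by_country(rows):
    """GROUP BY + SUM: combine values that share a key."""
    totals = defaultdict(int)
    for country, clicks in rows:  # rows are already (key, value) pairs
        totals[country] += clicks  # the reduceByKey step
    return dict(totals)

if __name__ == "__main__":
    print(sum_clicks_by_country(ROWS))
```

The translation pattern is the same at cluster scale: the SELECT list becomes the map stage's output tuple, and the GROUP BY aggregate becomes the keyed reduction.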
Tasks and Responsibilities:
• Detailed technical analysis and specifications;
• Technical artifacts for new releases and automated tests as appropriate;
• Participation in meetings with the project team;
• Inputs to team lead for monthly activity and status reports of the team;
Profile:
• Bachelor's or Master's degree;
• 5+ years of professional experience in software development;
• 3+ years of hands-on Java or Node.js/TypeScript development experience;
• Knowledge of JSF, Angular or Wicket;
• Knowledge of Docker, Solr/Elasticsearch, and/or Amazon Web Services (Lambda, DynamoDB, IaC, Cognito, SNS, SQS, Step Functions, CloudWatch, log groups, HTTP status codes used in REST APIs) is an advantage;
• Fluent in English
JD:
- Experience in maintaining Oracle Exadata, RAC, and Administration of Oracle Multitenant Architecture.
- Implemented Exadata features such as Smart Scan, Flash Cache, flash log, IORM, and storage indexes.
- Using CellCli or dcli to manage Exadata storage server objects.
- Monitoring cell disk, grid disk, cell flash cache, flash disk
- Configured dcli between cell nodes and database nodes.
- Monitoring Exadata storage servers with cell metrics alerts and active requests.
- Migrating legacy database to Exadata.
- Experience in Exadata patching
- Experience in RAC along with new features like Flex ASM and Flex cluster.
- Resolved issues such as replacing damaged physical disks or flash disks.
- Installing & configuring Oracle Clusterware & database software for Oracle 12c and 19c.
- Extensive Knowledge on Oracle Multitenant Architecture 12c and 19c.
- Experienced in Oracle installations, upgrades, migration, cloning, designing logical/physical architecture
- Database Refresh using Data pump.
- Expertise in Database upgrades, patching for standalone & RAC databases.
- Experienced in Database Point in Time Recovery.
- Enforcing security by creating roles and granting system and object privileges on tables and stored procedures, applying DBA concepts.
- Knowledge of Oracle Enterprise Manager
- Implementation of database compression using Exadata HCC
- Running ExaCheck reports monthly
- Implementing Smart Flash Cache and enabling flash cache write-back
- Hardware changes: flash disk, grid disk, and DIMM replacements
- Working knowledge on managing NFS mount points.
- ZFS share creation
- Configuring HugePages
Tagalys provides intelligent merchandising solutions to mid-sized e-commerce brands around the world, like LEGO, Fila, Crocs, Tink, Ana Luisa, Ritu Kumar, EQVVS, Lack of Color, and Apollo 247. Our products include category merchandising tools, search, recommendations, and a full analytics suite. Over the last three years, we have scaled from a two-person founding team to over 15 people. During this time, we’ve received amazing customer reviews on the quality of the product, and our customer support & success.
About the role
We are looking for a backend developer to help us execute our product roadmap faster. You will design, spec, develop, test, and deploy new features that can scale.
Your work could be related to:
- Defining new merchandising or reporting functionality
- Adding more advanced search capabilities
- Improving scoring/recommendation models
- eCommerce platform integrations/API implementations
- Load testing/scaling features as we continuously grow
- Something new. There are always interesting new challenges that come our way. Here are a couple of things we’ve worked on recently:
- Managing background jobs with competing performance goals like quick turnaround while staying within platform API limits
- Embedding and dynamically changing products in an email even after sending it
- Managing and coordinating multi data-center processes
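The background-jobs challenge above, quick turnaround while staying within platform API limits, is commonly handled with a token bucket. The sketch below is our own minimal illustration under that assumption, not Tagalys code; names and numbers are hypothetical.

```python
# Minimal token-bucket sketch: workers burst up to `capacity` requests,
# then are throttled to `rate` requests/second -- the trade-off between
# fast job turnaround and a platform API limit.
import time

class TokenBucket:
    def __init__(self, rate: float, capacity: float, now=time.monotonic):
        self.rate = rate          # tokens refilled per second
        self.capacity = capacity  # maximum burst size
        self.tokens = capacity
        self.now = now            # injectable clock for testing
        self.last = now()

    def try_acquire(self) -> bool:
        """Consume one token if available; never blocks the worker."""
        t = self.now()
        self.tokens = min(self.capacity, self.tokens + (t - self.last) * self.rate)
        self.last = t
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

if __name__ == "__main__":
    # Fake clock keeps the example deterministic.
    clock = [0.0]
    bucket = TokenBucket(rate=1.0, capacity=2.0, now=lambda: clock[0])
    print([bucket.try_acquire() for _ in range(3)])  # [True, True, False]
    clock[0] += 1.0                                  # one second passes
    print(bucket.try_acquire())                      # True: one token refilled
```

A job queue would call `try_acquire()` before each API request and requeue the job on `False`, which keeps throughput high during quiet periods without ever exceeding the platform's limit.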
Requirements
- 1-3 years of experience in Ruby On Rails (or a similar MVC framework like Express or Django). You are comfortable with Routes, Migrations, Models, Callbacks, Validations, and so on.
- Ability to design clean, scalable data structures for new features.
- A good understanding of design patterns so that complex processes are written in a maintainable way.
- Knowledge of managing source code with Git.
- Comfortable in a Linux OS.
- Good communication skills, and ability to clearly plan and describe features before building.
Benefits
- Challenging, exciting work, at scale
You get to understand the challenges our customers face first-hand, and collaboratively design and build amazing solutions that will be used by millions of shoppers through billions of API requests. There's always something new and exciting to build.
- Have a direct impact
Seven engineers currently build, refine, and scale the entire product. Working in a small team means you will have a say in the way the product is built, and your performance will have a direct impact on the company's outcome.
- Culture
Tagalys is a place for you to be yourself, enjoy your work, and achieve your potential. We love the energy that comes from working with smart people in a simple and welcoming environment. We design our workflow to make sure you are also able to focus on everything else in life that is important: family, friends, and health.
- Benefits
You get great pay, coupled with flexible hours, a hybrid work model, and comprehensive insurance for you and your dependents. For people contributing significantly, we also offer ESOPs, so there is an opportunity to build wealth as you help the company grow.
standards, primarily Java, J2EE, Spring, Hibernate and tools including open source tools and platforms, web
services and open interfaces to build software that is state-of-the-art. Details as follows:
• Extensive experience with web applications using Java, J2EE, Spring MVC, Struts 2, Hibernate/JPA, JSP
• Extensive knowledge of Java, JVM tuning and troubleshooting. Knowledge of various J2EE and servlet
containers such as Weblogic and Tomcat
• Proficient in JSON, Spring, XML, Struts 2, and web services (REST). Demonstrated knowledge of and experience working with APIs and SOA services
• Good knowledge of Web Services and related frameworks in Java like JAX-WS
• Extensive experience with object-oriented analysis and design patterns/techniques, with emphasis on Java/J2EE technology
• Good experience in JSP and AJAX using Spring MVC, JSON, jQuery, and Bootstrap
• Should be aware of Spring MVC, controllers, interceptors, filters and other framework features
• Knowledge of Keycloak would be an advantage.










