We're looking for a passionate and data-driven person to own the SEO & SEM section of the marketing funnel for Beagle Security. You'll be in charge of all online acquisition marketing, managing the strategy, execution, and optimization across SEO & SEM with a strong focus on generating ROI.
Beagle Security is a SaaS-based web application penetration testing tool that helps companies identify vulnerabilities in their websites & APIs before hackers exploit them. It is currently used by 1,500+ customers across 90+ countries to improve their website security.
Key Responsibilities
- Plan, implement and manage our SEO strategy
- Drive keyword strategy, discovery and expansion of keyword opportunities and link building initiatives
- Work with development team to ensure SEO best practices are properly implemented
- Track, report and analyze website analytics, pay-per-click (PPC) initiatives and campaigns
- Optimize copy and landing pages for paid marketing campaigns
- Collaborate with the marketing team to integrate and complement marketing strategies across multiple channels
Requirements
- Proven results in SEO/SEM or a similar role (prior experience working in SaaS is a plus)
- Well-versed in performance marketing, conversion and customer acquisition
- Ability to analyze data and provide recommendations to improve our SEO/SEM performance
- Experience working with SEO, SEM and website analytics tools
- Familiar with A/B testing and running growth experiments
- Up to date with the latest SEO/SEM trends and best practices
- Critical thinking and problem-solving skills
- Ability to work independently with minimal supervision and comfortable working in a fast-paced environment

- Strong Product Design / UX-UI Design profiles
- Mandatory (Experience 1) – Must have 4+ years of hands-on experience designing digital products (UX/UI) across web and mobile platforms.
- Mandatory (Experience 2) – Must have a strong portfolio showcasing end-to-end design execution: user research, wireframing, prototyping, and high-fidelity UI work.
- Mandatory (Experience 3) – Must have proficiency in Figma, Adobe Creative Suite, and prototyping tools such as Framer, Principle, or similar.
- Mandatory (Experience 4) – Must have worked directly with product managers, engineers, and founders to shape product vision and strategy.
- Mandatory (Company) – Product companies only (B2B preferred).
If interested, please share your resume at ayushi.dwivedi at cloudsufi.com
Note: This role is remote but requires a quarterly visit to the Noida office (1 week per quarter). If that works for you, please share your resume.
Data Engineer
Position Type: Full-time
About Us
CLOUDSUFI, a Google Cloud Premier Partner, is a leading global provider of data-driven digital transformation for cloud-based enterprises. With a global presence and a focus on Software & Platforms, Life Sciences & Healthcare, Retail, CPG, Financial Services, and Supply Chain, CLOUDSUFI is positioned to meet customers where they are in their data monetization journey.
Job Summary
We are seeking a highly skilled and motivated Data Engineer to join our Development POD for the Integration Project. The ideal candidate will be responsible for designing, building, and maintaining robust data pipelines to ingest, clean, transform, and integrate diverse public datasets into our knowledge graph. This role requires a strong understanding of Google Cloud Platform (GCP) services, data engineering best practices, and a commitment to data quality and scalability.
Key Responsibilities
ETL Development: Design, develop, and optimize data ingestion, cleaning, and transformation pipelines for various data sources (e.g., CSV, API, XLS, JSON, SDMX) using Google Cloud Platform services (Cloud Run, Dataflow) and Python.
Schema Mapping & Modeling: Work with LLM-based auto-schematization tools to map source data to our schema.org vocabulary, defining appropriate Statistical Variables (SVs) and generating MCF/TMCF files.
Entity Resolution & ID Generation: Implement processes for accurately matching new entities with existing IDs or generating unique, standardized IDs for new entities.
Knowledge Graph Integration: Integrate transformed data into the Knowledge Graph, ensuring proper versioning and adherence to existing standards.
API Development: Develop and enhance REST and SPARQL APIs via Apigee to enable efficient access to integrated data for internal and external stakeholders.
Data Validation & Quality Assurance: Implement comprehensive data validation and quality checks (statistical, schema, anomaly detection) to ensure data integrity, accuracy, and freshness. Troubleshoot and resolve data import errors.
Automation & Optimization: Collaborate with the Automation POD to leverage and integrate intelligent assets for data identification, profiling, cleaning, schema mapping, and validation, aiming for significant reduction in manual effort.
Collaboration: Work closely with cross-functional teams, including Managed Service POD, Automation POD, and relevant stakeholders.
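As a flavor of the ingest-clean-transform work described above, here is a minimal local sketch in Python. All names, fields, and sample values are hypothetical; in production these steps would run on Cloud Run or Dataflow and load into BigQuery rather than printing JSON.

```python
import csv
import hashlib
import io
import json

def stable_entity_id(name: str, namespace: str = "demo") -> str:
    """Deterministic ID for an entity name (stands in for real entity resolution)."""
    digest = hashlib.sha256(f"{namespace}:{name}".encode("utf-8")).hexdigest()
    return f"{namespace}/{digest[:12]}"

def clean_and_transform(raw_csv: str) -> list:
    """Ingest CSV text, drop invalid rows, and emit JSON-ready records."""
    records = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        name = (row.get("city") or "").strip()
        value = (row.get("population") or "").strip()
        if not name or not value.isdigit():  # simple schema/validity check
            continue
        records.append({
            "entity_id": stable_entity_id(name),
            "statVar": "Count_Person",  # hypothetical StatisticalVariable name
            "value": int(value),
        })
    return records

# Two of the three rows are invalid (missing city, non-numeric population).
raw = "city,population\nPune,3124458\n,99\nDelhi,notanumber\n"
print(json.dumps(clean_and_transform(raw)))
```

The deterministic hash-based ID stands in for the entity-resolution step: re-running the pipeline on the same entity always yields the same ID, while genuinely new entities get new ones.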
Qualifications and Skills
Education: Bachelor's or Master's degree in Computer Science, Data Engineering, Information Technology, or a related quantitative field.
Experience: 3+ years of proven experience as a Data Engineer, with a strong portfolio of successfully implemented data pipelines.
Programming Languages: Proficiency in Python for data manipulation, scripting, and pipeline development.
Cloud Platforms and Tools: Expertise in Google Cloud Platform (GCP) services, including Cloud Storage, Cloud SQL, Cloud Run, Dataflow, Pub/Sub, BigQuery, and Apigee. Proficiency with Git-based version control.
Core Competencies:
Must Have - SQL, Python, BigQuery, (GCP DataFlow / Apache Beam), Google Cloud Storage (GCS)
Must Have - Proven ability in comprehensive data wrangling, cleaning, and transforming complex datasets from various formats (e.g., API, CSV, XLS, JSON)
Secondary Skills - SPARQL, Schema.org, Apigee, CI/CD (Cloud Build), GCP, Cloud Data Fusion, Data Modelling
Solid understanding of data modeling, schema design, and knowledge graph concepts (e.g., Schema.org, RDF, SPARQL, JSON-LD).
Experience with data validation techniques and tools.
Familiarity with CI/CD practices and the ability to work in an Agile framework.
Strong problem-solving skills and keen attention to detail.
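For context on the knowledge-graph concepts listed above (Schema.org, JSON-LD), a minimal Schema.org record expressed as JSON-LD looks like this. The values are purely illustrative.

```python
import json

# A minimal schema.org "Dataset" description in JSON-LD (illustrative values).
record = {
    "@context": "https://schema.org",
    "@type": "Dataset",
    "name": "City population estimates",
    "variableMeasured": "population",
}
doc = json.dumps(record, indent=2)
print(doc)
```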
Preferred Qualifications:
Experience with LLM-based tools or concepts for data automation (e.g., auto-schematization).
Familiarity with similar large-scale public dataset integration initiatives.
Experience with multilingual data integration.
Hiring freshers for an international voice process at the Bangalore location.
For more details, contact Dinesh.
Thanks
As a System Administrator at Kyndryl, you’ll solve complex problems and identify potential future issues across the spectrum of platforms and services. You’ll be at the forefront of new technology and modernization, working with some of our biggest clients – which means some of the biggest in the world.
There’s never a typical day as a System Administrator at Kyndryl, because no two projects are alike. You’ll be managing systems data for clients and providing day-to-day solutions and security compliance. You’ll oversee a queue of assignments and work directly with technicians, prioritizing tickets to deliver the best solutions to our clients.
One of the benefits of Kyndryl is that we work with clients in a variety of industries, from banking to retail. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here. You’ll also get the chance to share your expertise by recommending modernization options, identifying new business opportunities, and cultivating relationships with other teams and stakeholders. Does the work get challenging at times? Yes! But you’ll collaborate with a diverse group of talented people and gain invaluable management and organizational skills, which will come in handy as you move forward in your career.
Key Responsibilities
- Initialize/add new DASD volumes and add new tape volumes. Perform space management activities such as VTOC resizing, defrag, and compress/reclaim/release/cleaning of datasets.
- Manage and define SMS rules and ACS routines.
- Manage and maintain DFSMShsm and catalogs.
- Reorganize storage datasets (RMM/HSM CDSs and catalog files).
- Generate all types of storage reports, including SMF and DCOLLECT.
- Manage the DR environment for DR readiness and execution; perform disaster recovery tests.
- Configure storage boxes (DASD & tape).
- Migrate storage box data (DASD to DASD and tape to tape).
- Plan and implement disaster recovery (tape recovery / DASD replication / tape replication / HyperSwap).
- Set up disk replication using CSM (Copy Services Manager) / GDPS.
Your future at Kyndryl
Every position at Kyndryl offers a way forward to grow your career, from Junior System Administrator to Architect. We have opportunities for Cloud Hyperscalers that you won’t find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms.
You’re good at what you do and possess the required experience to prove it. However, equally as important – you have a growth mindset; keen to drive your own personal and professional development. You are customer-focused – someone who prioritizes customer success in their work. And finally, you’re open and borderless – naturally inclusive in how you work with others.
Required Technical and Professional Expertise
- 6+ years of experience managing DASD and tape data storage in a mainframe environment
- Storage Management related knowledge under the z/OS: including DFSMS, ISMF, NaviQuest, DFSMSHSM, DFSMSdss, DFSMSdfp, DFSMSrmm, ICF Catalog, IDCAMs, ICKDSF, VSAM and Non-VSAM Datasets (PS/PDS/PDSE/GDG)
- Practical knowledge of the z/OS operating system, TSO, ISPF, JCL, VSAM, and JES2
- Vendor Products for storage management including IBM (TACM, TAAM), CA (CA Disk, CA-allocate and CA-1), FDR products, Storage monitoring tools (Omegamon XE for storage, BMC-SRM, CA-Vantage)
- Data extraction, analysis & reporting from Dcollect/Tape Management/BVIR reports/SMF
- Working Knowledge & experience in REXX for task simplification & automation
- Replication technologies including IBM Metro Mirror, IBM Global Mirror, and other IBM multi-site replication setups
- Disk Replication management products including CSM & GDPS.
- Expertise in Storage hardware subsystems including IBM DASD (DS8K), IBM Automated tape library and IBM Virtual tape library (TS7700/TS7720/TS7740/TS7760/TS7770)
- Configuration of DASD (carving & connecting) & Tape hardware (setting up TS7700) for mainframes
- Disk migration using TDMF, FDRPAS. Disk migration using Replication technologies
- Tape migration using products/tools such as T2T, Tape/Assist, CopyCat. Tape Migration within TS7700 Grid to new Clusters.
- Problem solving: Recognize complex problems related to functional objectives
- Analyze situations and implement solutions, or develop new system elements, procedures or processes
- Creativity and judgment applied to developmental work on different projects within the business environment
Preferred Technical and Professional Experience
- Experience in EMC’s storage (VMAX, Powermax and DLM) and Replication (SRDF, GDDR)
- Experience in Hitachi’s storage (USPV, VSP) and replication (HUR, Shadow-Image, BCM)
- SAS experience, Mainframe Storage modernization related knowledge and skills including: Model9, TCT, CTC, ZEDC implementation
- Cyber resiliency & Security related knowledge & skills including CSM Based SafeGuarded Copy, GDPS LCP, IBM CyberVault, Pervasive Encryption
Job Position - Blockchain Developer
Experience - 4+ Years
Job Location - Bangalore
Responsibilities:
• Search, design, develop, and test blockchain technologies.
• Brainstorm and help evaluate applications for new tools and technologies as they continually evolve.
• Maintain and extend current client- and server-side applications responsible for integration and business logic.
• Be involved in the global blockchain community—work on implementing and integrating the latest improvement proposals.
• Document new solutions as well as maintain the existing ones.
Requirements:
• 4+ years of total experience in backend development, with a minimum of 1 year of experience in Ethereum/blockchain
• Knowledge of any one of the following programming languages: Node.js, Java, Go, C, C++, PHP
• Basic understanding of blockchain technology: Bitcoin, Smart Contracts, Solidity, Cryptography
• Knowledge of private blockchains such as Hyperledger and Corda is a plus
• Basic understanding of HTML, CSS, JS, Bootstrap, and knowledge of front-end technologies such as Angular/ React/ Vue is a plus.
• Knowledge in databases and database modeling: MySQL, PostgreSQL, MongoDB
• Understanding of Software Development Lifecycle (SDLC)
• Experience using Git for version control

• Experienced in designing and integrating RESTful APIs
• Knowledge of Python
• Excellent debugging and optimization skills
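To illustrate the blockchain fundamental behind the requirements above (blocks linked by cryptographic hashes), here is a toy hash-linked chain in Python. This is an illustration only, not a real blockchain: there is no consensus, networking, or proof-of-work, and all names are invented for the sketch.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """SHA-256 over a canonical JSON encoding of the block."""
    payload = json.dumps(block, sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()

def add_block(chain: list, data: str) -> None:
    """Append a block that commits to the hash of its predecessor."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "data": data, "prev_hash": prev})

def is_valid(chain: list) -> bool:
    """Every block must reference the current hash of the block before it."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain = []
add_block(chain, "genesis")
add_block(chain, "tx: A pays B 10")
print(is_valid(chain))  # tampering with chain[0] afterwards would make this False
```

The point of the hash linkage is tamper evidence: changing any earlier block changes its hash, which breaks the `prev_hash` reference stored in its successor.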
SKILLS
• 3-5 years of experience building large-scale software applications and working with large software teams.
• Bachelor’s degree in computer science, information technology, or engineering
• Experience designing and integrating RESTful APIs
• Knowledge of Python and Backend Development
• Experience building Web/Mobile applications
• Excellent debugging and optimization skills
• Unit and Integration testing experience
• Being knowledgeable about engineering processes and good practices
• Passionate about learning new tools; able to continuously learn and acquire knowledge.
• Able to adapt to the changing complexity of tasks.
- 2-5 years of experience in the Android ecosystem
- Knowledge of UPI-based apps
- Experience with payment gateway integration
- Solid understanding of Android SDK and Android architecture
- Strong OOPs fundamentals, Java, XML, JSON, Web Services, SQLite databases
- Proficient in developing optimized UI for different Android versions and devices
- You should have released a few apps on the play store or at least have a working prototype of something which you are proud of
- You love to write beautiful code
- Strong analytical and troubleshooting skills
- You are resourceful, innovative, and inventive
- You have experience writing unit tests
Key Responsibilities:
· You’ll be actively involved with development of the server backend that supports mobile apps.
· You’ll be architecting and implementing best-in-class complex ‘real time’ web software and/or messaging systems to power highly scalable apps for users.
· Translate high-level business problems into scalable design and code. Create libraries & utilities for broader consumption.
· Work closely with UI/UX designers to create exciting user experiences and ensure delivery of graphic assets as per modern web standards.
· You’ll be continuously keeping an eye on the latest cutting-edge technologies and leveraging these in one’s own and the team’s work as necessary.
· Care about the business results of what you build, not just the elegance of the technology you build.
· Work on the end to end stack (Platforms, UI, distributed systems, databases) rather than specialize in one area.
· And of course, get your hands dirty by writing server-side code for mobile-based applications, creating robust high-volume production applications, and developing prototypes quickly.
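The "real time" messaging systems mentioned above typically rest on a publish/subscribe pattern. As a language-agnostic illustration (a toy in-process sketch written in Python, though the role itself centers on Node.js), the core idea looks like this:

```python
from collections import defaultdict

class Broker:
    """Toy in-process publish/subscribe broker (illustrative only)."""

    def __init__(self):
        # topic name -> list of subscriber callbacks
        self._subs = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subs[topic].append(handler)

    def publish(self, topic, message):
        """Deliver the message to every subscriber; return the delivery count."""
        handlers = self._subs.get(topic, [])
        for handler in handlers:
            handler(message)
        return len(handlers)

broker = Broker()
received = []
broker.subscribe("chat", received.append)
broker.publish("chat", "hello")
print(received)  # -> ['hello']
```

In a production system the broker would be an external service (e.g. a WebSocket layer or message queue) rather than an in-process dictionary, but the subscribe/publish contract is the same.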
Eligibility
Desired Candidates Profile:
• 5+ years of experience on Node.js based web applications and systems development.
• Strong knowledge of MEAN, Linux/UNIX based development & client-side JavaScript/jQuery.
• Knowledgeable on ORMs, their utility and limitations.
• Knowledge of web services and serialization techniques like REST, SOAP, XML & JSON.
• Knowledgeable about caching mechanisms & tools like memcache, CDNs, nginx.
• Scripting experience in using Shell/Python for creating quick technology solutions to problems.
• Prior expertise working with AWS Cloud, CDNs, and other PaaS-based services.
• A strong penchant for Object-Oriented Design.
• Experience of working with version control, bug tracking, continuous integration and other productivity enhancement software like SVN, Bugzilla, Jira etc.
• Prior experience in implementing Agile software methodologies.
• Passionate about software development & modern-day web technologies like:
• Server-Side JavaScript – Node.js, Backbone.js
• Web Sockets
• NoSQL based databases like MongoDB/Couchbase/Redis
• Big Data
• Taking responsibility and ownership in the team’s work.
Additional Requirements:
• B.Tech/BE/BS/M.Tech/MS in Electronics or Computer Science from a premier institute in India (IITs, BITS, NITs, etc.) or abroad.
