
About Seclore
International hacking and state sponsored espionage are on the rise - and so are the technologies guarding the data. Are you in the game?
Seclore is an Information Security product company that has emerged as a global leader in the Data-Centric Security space. From nuclear submarine designs and new drug formulations to the customer data of Fortune 100 organizations, we guard every kind of confidential information. Thousands of enterprises across 29 countries - including governments - can vouch for our technology.
Seclore’s patent-pending, award-winning technology allows users to control how their information is used - even after it is shared with people within or outside the enterprise. This requires stretching technology boundaries beyond what might seem possible. It's not for faint-hearted or run-of-the-mill developers.
Innovation is in our blood. From our early days within IIT Bombay to a globally recognized name in the field, this is one thing that has always taken us to new horizons.
Information Security is a fast-evolving field - testing the limitations of today's technologies. It is the need of the hour for every enterprise - from Fortune 500 companies to military organizations. At Seclore, you get to be part of the front lines - defending today's data against tomorrow's threats.

Job Summary:
We are looking for a motivated and detail-oriented Data Engineer with 1–2 years of experience to join our data engineering team. The ideal candidate should have solid foundational skills in SQL and Python, along with exposure to building or maintaining data pipelines. You’ll play a key role in helping to ingest, process, and transform data to support various business and analytical needs.
Key Responsibilities:
- Assist in the design, development, and maintenance of scalable and efficient data pipelines.
- Write clean, maintainable, and performance-optimized SQL queries.
- Develop data transformation scripts and automation using Python.
- Support data ingestion processes from various internal and external sources.
- Monitor data pipeline performance and help troubleshoot issues.
- Collaborate with data analysts, data scientists, and other engineers to ensure data quality and consistency.
- Work with cloud-based data solutions and tools (e.g., AWS, Azure, GCP – as applicable).
- Document technical processes and pipeline architecture.
Core Skills Required:
- Proficiency in SQL (data querying, joins, aggregations, performance tuning).
- Experience with Python, especially in the context of data manipulation (e.g., pandas, NumPy).
- Exposure to ETL/ELT pipelines and data workflow orchestration tools (e.g., Airflow, Prefect, Luigi – preferred).
- Understanding of relational databases and data warehouse concepts.
- Familiarity with version control systems like Git.
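As a hedged illustration of the SQL-plus-Python transformation work this role describes - not code from the posting, and with hypothetical table and column names - a minimal aggregation sketch using only the standard library:

```python
# Minimal sketch of a SQL + Python transformation step, assuming a
# hypothetical "orders" table with (day, amount) rows.
import sqlite3

def daily_revenue(rows):
    """Aggregate (day, amount) order rows per day with SQL,
    then round the totals in Python."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE orders (day TEXT, amount REAL)")
    con.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    cur = con.execute(
        "SELECT day, SUM(amount) FROM orders GROUP BY day ORDER BY day"
    )
    result = {day: round(total, 2) for day, total in cur.fetchall()}
    con.close()
    return result
```

In practice the same pattern scales up with pandas or a warehouse engine; the shape of the work - query, aggregate, post-process in Python - stays the same.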
Preferred Qualifications:
- Experience with cloud data services (AWS S3, Redshift, Azure Data Lake, etc.)
- Familiarity with data modeling and data integration concepts.
- Basic knowledge of CI/CD practices for data pipelines.
- Bachelor’s degree in Computer Science, Engineering, or related field.
Looking to launch your career in Sales & Marketing? NuVeda is the perfect launchpad for you!
Who are we?
NuVeda, a two-time winner of eLearning Industry’s award for best product (2022), was established in 2006 and is a fast-growing SaaS company with offices in Minneapolis, Chennai, and Bangalore.
What do we do?
As a strategic growth partner in learning and development, we help our Customers Design, Deliver & Manage all their learning interventions at scale, Measure the business impact and Monetize the learning assets.
Our Vision & Culture
With a vision to build the "Google of Learning", we thrive in an open and honest organizational culture where Autonomy, Alignment to Purpose, Integrity, and Continuous Learning and Development are fostered to keep us future-ready.
Why us?
We promise you an ambitious career path, compensation and benefits in line with industry standards, and a team of high-potential colleagues who will push you to your limitless potential.
Role Summary
As a Sales Development Representative, you’ll play a crucial role in NuVeda’s Sales & Pipeline Generation Activities. You’ll be working closely with the Sales & Marketing Team to design outbound campaigns & generate qualified leads that will result in new business opportunities.
Responsibilities
- Research, target, and identify new client opportunities on an ongoing basis.
- Develop targeted/personalized messaging for outbound cold calls, email, and LinkedIn outreach.
- Qualify prospects by understanding customer needs and budgets.
- Maintain a detailed database on the CRM with all customer communications.
- Collaborate with teammates and achieve quarterly milestones.
Desired Skills
- Any graduate who is ambitious, energetic, and highly motivated to learn and grow in a fast-paced environment.
- 1-2 years of relevant experience in B2B SaaS Lead Generation is preferred.
- Knowledge of the SaaS ecosystem is preferred.
- A record of meeting and exceeding quotas, and an understanding of conversion metrics.
- Working experience with social media tools (LinkedIn), lead-generation tools (Apollo.io), and CRM tools (Zoho or others).
- Ability to work on omnichannel lead generation tactics.
- Ability to launch & orchestrate new content-based Sales Development campaigns.
- Excellent communication skills.

LogiNext is looking for a technically savvy and passionate Principal Engineer - Data Science to analyze large amounts of raw information and find patterns that will help improve our company. We will rely on you to build data products that extract valuable business insights.
In this role, you should be highly analytical with a knack for analysis, math and statistics. Critical thinking and problem-solving skills are essential for interpreting data. We also want to see a passion for machine-learning and research.
Your goal will be to help our company analyze trends and make better decisions. Data scientists who don't understand how the software works will struggle in this role: beyond experience developing in R and Python, you must know modern approaches to software development and their impact. DevOps practices such as continuous integration and deployment, along with experience in cloud computing, are everyday skills for managing and processing data.
Responsibilities :
- Adapt and enhance machine learning techniques based on physical intuition about the domain.
- Design sampling methodology; prepare data, including data cleaning, univariate analysis, and missing-value imputation; identify appropriate analytic and statistical methodology; develop predictive models; and document the process and results.
- Lead projects as both principal investigator and project manager, responsible for meeting project requirements on schedule and on budget.
- Coordinate and lead efforts to innovate by deriving insights from heterogeneous sets of data generated by our suite of Aerospace products.
- Support and mentor data scientists.
- Maintain and work with our data pipeline, which transfers and processes several terabytes of data using Spark, Scala, Python, Apache Kafka, Pig/Hive, and Impala.
- Work directly with application teams/partners (internal clients such as Xbox, Skype, Office) to understand their offerings/domain and help them become successful with data so they can run controlled experiments (A/B testing).
- Understand the data generated by experiments, and produce actionable, trustworthy conclusions from them.
- Apply data analysis, data mining, and data processing to present data clearly and develop experiments (A/B testing).
- Work with the development team to build tools for data logging and repeatable data tasks to accelerate and automate data-scientist duties.
Requirements:
- Bachelor’s or Master’s degree in Computer Science, Math, Physics, Engineering, Statistics, or another technical field; PhD preferred.
- 8 to 10 years of experience in data mining, data modeling, and reporting.
- 5+ years of experience working with large data sets or doing large-scale quantitative analysis.
- Expert SQL scripting required.
- Development experience in one of the following: Scala, Java, Python, Perl, PHP, C++, or C#.
- Experience working with Hadoop, Pig/Hive, Spark, and MapReduce.
- Ability to drive projects.
- Basic understanding of statistics: hypothesis testing, p-values, confidence intervals, regression, classification, and optimization are core vocabulary.
- Analysis: able to perform exploratory data analysis and extract actionable insights from the data, with impressive visualization.
- Modeling: familiar with ML concepts and algorithms; understanding of the internals and pros/cons of models is required.
- Strong algorithmic problem-solving skills.
- Experience manipulating large data sets through statistical software (e.g., R, SAS) or other methods.
- Superior verbal, visual, and written communication skills to educate and work with cross-functional teams on controlled experiments.
- Experimentation design or A/B testing experience is preferred.
- Experience in leading a team is required.
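As a hedged illustration of the statistics vocabulary listed above (hypothesis testing, p-values, A/B testing) - an illustrative sketch, not code from the posting - a two-proportion z-test implemented with the standard library only:

```python
# Illustrative two-proportion z-test for an A/B experiment, using only
# the standard library. Variable names are hypothetical.
from statistics import NormalDist

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Compare conversion counts of variants A and B under the
    pooled-proportion null hypothesis; return (z, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = (pooled * (1 - pooled) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value
```

A low p-value (conventionally below 0.05) suggests the difference in conversion rates is unlikely under the null hypothesis of equal rates; production experimentation platforms layer power analysis and multiple-testing corrections on top of this basic test.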
- Requires 9-hour days
- Work from office (Balewadi High Street)
Day-to-day responsibilities include:
1. Research and post on a LinkedIn community we are building
2. Conduct market research into social media trends, influencers, etc.
3. Explore AI tools & SaaS tools that can automate and scale current efforts
4. Come up with creative strategies to improve engagement/impressions
5. Research competitors' outreach strategies.

Desired Experience
3+ years
Job Description
What You’ll Do
- Taking an active role in architecting the solutions we build
- Designing and implementing web applications using JavaScript and its frameworks (e.g., React, Vue, Angular), HTML5, and CSS3
- Working closely with the dispersed development team, client, and project manager in SCRUM methodology
- Ensuring that programming practices and code quality are respected
- Supporting other engineers with your expertise when needed (knowledge sharing)
- Leading your colleagues’ growth and development
What you bring to the table
- 3+ years of experience with React; working knowledge of React Native, Angular, or Vue.js is good to have
- Strong expertise with HTML, CSS, and writing cross-browser-compatible code.
- Good understanding of AJAX and JavaScript DOM manipulation Techniques
- Experience with RESTful services
- Experience with JavaScript build tools like Grunt or Gulp
- Hands-on experience implementing complex React modules
- Able to implement automated testing platforms and unit tests
- Understanding of fundamental design principles behind a scalable application
- Proficient understanding of code versioning tools (Git)
- Critical thinking and problem-solving skills
Bonus if you have…
- Experience with Angular2+ and Node
- Experience with AWS or GCP infrastructure
- Curiosity about new languages, libraries, frameworks
- Contributions to open-source projects
- Experience with Selenium, Jest, or similar front-end test frameworks


Job Description:
- Build efficient, testable, and reusable modules.
- Stay innovative and open to new tech trends.
- Knowledge of APIs, Core PHP, CodeIgniter, Laravel, etc.
- Collaborate with other team members and stakeholders.
- Strong understanding of database concepts, methodologies, source control and bug tracking.
- Presenting ideas for system improvements
Job Type: Full-time
Salary: As per company norms
Education: Bachelor’s (Required), Master’s (Preferred)
Location: Indore, Madhya Pradesh (Required)
Best Regards,
HR Department
Office Address: 519, 6th Floor, Onam Plaza, Near Industry House, AB Road, Indore, Madhya Pradesh, 452001
- Design, build and maintain efficient, reusable, and reliable Java code.
- Providing technical oversight to the team, involved in design and code review.
- You will spend most of your time on development activities across varied technologies and should have a passion for writing code.
- Should be able to create clear technical documentation.
- Translate application storyboards and use cases into functional applications.
- Ensure the best possible performance, quality, and responsiveness of the applications.
- Identify bottlenecks and bugs, and devise solutions to these problems.
- Help maintain code quality, organization, and automation.
- Prepare the technical design of complex technology components as well as suggest the pros and cons of using a certain technology stack or component or design pattern versus another, to the clients.
- Team Management.
Required Skills and Qualifications
- Qualifications: BTECH/MTECH/MCA/MSc.
- Proficient in Java, with a good knowledge of its ecosystems with a knack for writing clean, readable Java code, writing reusable Java libraries along with knowledge of multithreading, concurrency patterns, and collections in Java.
- Solid understanding of object-oriented programming along with various design and architectural patterns.
- Hands on experience with JMS, JPA, Spring (MVC, Boot & Cloud preferred) & Hibernate.
- Familiarity with MVC, JDBC, and RESTful web services. Experience with the presentation layer (JSP/Servlets) and JS frameworks (Angular, jQuery, React, etc.).
- Creating database schemas that represent and support business processes and experience with both external and embedded databases. Implementing automated testing platforms and unit tests.
- Proficient understanding of code versioning tools such as Git; build tools such as Ant, Maven, and Gradle; and continuous integration.
- Knowledge of XML-based mappings, SAML, REST clients, CAS authentication, and Jetty.
- Knowledge of Apache Camel and Kafka and Drools Rule Engine is preferred.
Role Purpose:
As a DevOps Engineer, you should be strong in both the Dev and the Ops parts of DevOps. We are looking for someone who has a deep understanding of systems architecture, understands core CS concepts well, and is able to reason about system behaviour rather than merely working with the toolset of the day. We believe that only such a person will be able to set a compelling direction for the team and excite those around them.
If you are someone who fits the description above, you will find that the rewards are well worth the high bar. Being one of the early hires of the Bangalore office, you will have a significant impact on the culture and the team; you will work with a set of energetic and hungry peers who will challenge you, and you will have considerable international exposure and opportunity for impact across departments.
Responsibilities
- Deployment, management, and administration of web services in a public cloud environment
- Design and develop solutions for deploying highly secure, highly available, performant and scalable services in elastically provisioned environments
- Design and develop continuous integration and continuous deployment solutions from development through production
- Own all operational aspects of running web services including automation, monitoring and alerting, reliability and performance
- Have direct impact on running a business by thinking about innovative solutions to operational problems
- Drive solutions and communication for production impacting incidents
- Running technical projects and being responsible for project-level deliveries
- Partner well with engineering and business teams across continents
Required Qualifications
- Bachelor’s or advanced degree in Computer Science or closely related field
- 4–6 years of professional experience in DevOps, with at least 1–2 years in Linux/Unix
- Very strong in core CS concepts around operating systems, networks, and systems architecture including web services
- Strong scripting experience in Python and Bash
- Deep experience administering, running and deploying AWS based services
- Solid experience with Terraform, Packer and Docker or their equivalents
- Knowledge of security protocols and certificate infrastructure.
- Strong debugging, troubleshooting, and problem solving skills
- Broad experience with cloud-hosted applications, including virtualization platforms, relational and non-relational data stores, reverse proxies, and orchestration platforms
- Curiosity, continuous learning and drive to continually raise the bar
- Strong partnering and communication skills
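As a hedged sketch of the scripting and automation/monitoring work listed above - illustrative only, with a hypothetical check function injected so the example stays self-contained and offline:

```python
# Illustrative retry-with-exponential-backoff wrapper for a service
# health check; the check and sleep functions are injected so the
# sketch is testable without touching a real service.
import time

def check_with_backoff(check, attempts=4, base_delay=0.5, sleep=time.sleep):
    """Call `check()` until it returns True, doubling the delay between
    attempts. Returns True on success, False if all attempts fail."""
    for attempt in range(attempts):
        if check():
            return True
        if attempt < attempts - 1:
            sleep(base_delay * (2 ** attempt))
    return False
```

In a real deployment the injected `check` might wrap an HTTP health endpoint, and the wrapper would typically add jitter and a delay cap; this sketch shows only the core backoff pattern.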
Preferred Qualifications
- Past experience as a senior developer or application architect strongly preferred.
- Experience building continuous integration and continuous deployment pipelines
- Experience with Zookeeper, Consul, HAProxy, ELK-Stack, Kafka, PostgreSQL.
- Experience working with, and preferably designing, a system compliant with a security framework (PCI DSS, ISO 27000, HIPAA, SOC 2, ...)
- Experience with AWS orchestration services such as ECS and EKS.
- Experience working with AWS ML pipeline services like AWS SageMaker
