
Skills Required:
✔️ Strong expertise in data, AI, and cloud infrastructure.
✔️ Proficiency in Python, SQL, and basic scripting (Shell/Bash).
✔️ Experience working with Azure Databricks, GCP, or AWS.
✔️ Hands-on experience with Apache Spark, Hive, Hadoop, and Big Data ETL tools.
✔️ Knowledge of Jenkins, JIRA, Bitbucket, Bamboo, Azure DevOps, and Git.
✔️ Strong experience in SQL databases (Oracle, SQL Server, DB2) and NoSQL databases.
Key Responsibilities:
🔹 Develop and maintain scalable data pipelines from multiple data sources.
🔹 Collaborate with cross-functional teams to source, process, and integrate data efficiently.
🔹 Support in designing AI-driven data solutions for business intelligence and automation.
🔹 Partner with stakeholders to manage project risks and track delivery progress.
🔹 Ensure timely execution of project milestones, deadlines, and budgets.
🔹 Troubleshoot, fine-tune, and optimize large-scale data processing jobs for performance.

About CredenceSoft
- Develop and implement sales strategies for IT staff augmentation and consulting services.
- Identify and engage with potential clients, including IT firms, enterprises, and startups.
- Build and maintain long-term client relationships with hiring managers and procurement teams.
- Understand client requirements for contract, contract-to-hire, and full-time IT staffing.
- Work closely with the recruitment team to ensure the timely delivery of qualified IT professionals.
- Negotiate contracts, pricing, and service agreements in line with business goals.
- Monitor market trends, competitor activities, and emerging IT staffing needs.
- Maintain an up-to-date sales pipeline, providing reports on revenue forecasts and client engagement.
Responsibilities:
We’re looking for someone who is enthusiastic, comfortable with uncertainty, flexible, great with people, and isn’t afraid to roll up their sleeves and get their hands dirty. Every day is an opportunity to bring your vision to life and to expand, learn, and grow. No idea is left unconsidered. No voice is left unheard.
- Supporting system design, development and maintenance
- Taking responsibility for personal technical quality standards within the team
- Assisting in defining structured practices, especially in source code management, building and deployment
- Optimising applications for maximum speed and scalability
- 2-6 years of experience
Skills
Node.js
Desired Competencies (Technical/Behavioral Competency)
- Strong knowledge of Splunk architecture, components, and deployment models (standalone, distributed, or clustered)
- Hands-on experience with Splunk forwarders, search processing, and index clustering
- Proficiency in writing SPL (Search Processing Language) queries and creating dashboards
- Familiarity with Linux/Unix systems and basic scripting (e.g., Bash, Python)
- Understanding of networking concepts and protocols (TCP/IP, syslog)
Key Responsibilities
- Deploy Splunk Enterprise or Splunk Cloud on servers or virtual environments.
- Configure indexing and search head clusters for data collection and search functionalities.
- Deploy universal or heavy forwarders to collect data from various sources and send it to the Splunk environment.
- Configure data inputs (e.g., syslog, SNMP, file monitoring) and outputs (e.g., storage, dashboards).
- Identify and onboard data sources such as logs, metrics, and events.
- Use regular expressions or predefined methods to extract fields from raw data.
- Configure props.conf and transforms.conf for data parsing and enrichment.
- Create and manage indexes to organize and control data storage.
- Configure roles and users with appropriate permissions using role-based access control (RBAC).
- Integrate Splunk with external authentication systems like LDAP, SAML, or Active Directory.
- Monitor user activities and changes to the Splunk environment.
- Optimize Splunk for better search performance and resource utilization.
- Regularly monitor the status of indexers, search heads, and forwarders.
- Configure backups for configurations and indexed data.
- Diagnose and resolve issues like data ingestion failures, search slowness, or system errors.
- Install and manage apps and add-ons from Splunkbase or custom-built solutions.
- Create python scripts for automation and advanced data processing.
- Integrate Splunk with ITSM tools (e.g., ServiceNow), monitoring tools, or CI/CD pipelines.
- Use Splunk's REST API for automation and custom integrations.
- Splunk Core Certified Admin certification is good to have.
Splunk Development and Administration
- Build and optimize complex SPL (Search Processing Language) queries for dashboards, reports, and alerts.
- Develop and manage Splunk apps and add-ons, including custom Python scripts for data ingestion and enrichment.
- Onboard and validate data sources in Splunk, ensuring proper parsing, indexing, and field extractions.
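As a hedged sketch of the field-extraction work described above: Splunk's props.conf/transforms.conf extractions are driven by named-capture regular expressions, and the same idea can be prototyped in plain Python. The log format, field names, and pattern below are illustrative assumptions, not taken from any particular deployment:

```python
import re

# Hypothetical raw log line; the named-capture regex mirrors the kind of
# EXTRACT stanza Splunk applies via props.conf/transforms.conf.
RAW = "2024-05-01T10:15:32 host=web01 action=login user=alice status=success"

# Field names (host, action, user, status) are illustrative assumptions.
PATTERN = re.compile(
    r"host=(?P<host>\S+)\s+action=(?P<action>\S+)\s+"
    r"user=(?P<user>\S+)\s+status=(?P<status>\S+)"
)

def extract_fields(line: str) -> dict:
    """Return a dict of extracted fields, or an empty dict on no match."""
    match = PATTERN.search(line)
    return match.groupdict() if match else {}

print(extract_fields(RAW))
```

Prototyping the regex this way before committing it to a transforms.conf stanza makes it easy to test against sample events offline.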
Description of Role
- We are looking for an enthusiastic, analytically minded, data-driven BI Consultant/Developer for our Mumbai-based Data & Analytics team.
- The Business Intelligence consultant/developer is a member of the analytics team whose focus is to provide hands-on consulting and development.
- Understand and document the reporting business requirements, processes, and workflows, developing both written and visual depictions of requirements and process flows.
- Work closely with the BI product manager and lead dashboard design and information presentation ideation meetings with business stakeholders.
- Work extensively in Tableau Desktop and MS Power BI, designing and building highly customized dashboards to provide stunning visual analytics, and build assets like datasets, reports, and dashboards using tools like Tableau, Power BI, and Dataiku for internal customers within investment (Performance, Marketing, Sales, etc.).
Key responsibilities
- Build Tableau/Power BI data models utilizing best practices; ensure data accuracy and the ability to blend with other certified data sources.
- Understand the Analytics & Business Intelligence framework and partner with the Analytics Technology & Enterprise Reporting Technology teams to build and deploy reports and dashboards.
- Evaluate data sources and technology options, and recommend appropriate solutions.
- Build and maintain a reporting data dictionary and data lineage.
- Perform data analysis and data mapping, and develop ad hoc queries.
- Write SQL queries, stored procedures, scripts, and ETL jobs.
- Perform QA testing of developed deliverables and assist with User Acceptance Testing.
- Manage the prioritization, logging, and tracking of user-related issues, defects, enhancements, and work requests.
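As a small illustration of the ad hoc query and dashboard-dataset work listed above, here is a hedged sketch using SQLite; the `sales` table, its columns, and the data are invented for illustration, and real work would run against certified data sources:

```python
import sqlite3

# In-memory database with a hypothetical sales table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("APAC", 120.0), ("EMEA", 80.0), ("APAC", 30.0)],
)

# Aggregate per region -- the kind of ad hoc query that feeds a
# Tableau/Power BI extract or dashboard dataset.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total FROM sales "
    "GROUP BY region ORDER BY total DESC"
).fetchall()
print(rows)  # [('APAC', 150.0), ('EMEA', 80.0)]
```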
Experience and Skills
- Analytic and data-driven mindset with a finesse for building dashboards that tell a story.
- Strong communication skills, interacting effectively with quantitative colleagues as well as less technical audiences.
- Minimum of 5 years of proven experience creating dashboards, reports, and visualizations with interactive capabilities.
- Minimum of 5 years’ experience in business/data analysis and data preparation.
- Minimum of 2 years of experience working with big data and building data apps using Python.
- Broad industry knowledge of investment management is a plus.
- Excellent verbal and writing skills in English.
- Excellent time management and ability to work under pressure to meet tight deadlines.
- A strong commitment to quality.
Job Overview:
To generate revenue by selling telecommunications services and products to businesses and individuals, including mobile, internet, voice, and data solutions. The role involves building and maintaining strong customer relationships, meeting sales targets, and providing excellent customer service.
Job Description:
* Exceptional communication skills to provide support to customers & team members if and when required.
* Ability to lead a customer through the sales process, determining needs, and closing the sale.
* Strong foundational knowledge in information technology on data and voice solutions from telco service providers.
* Worked on solutions like MPLS, SD-WAN, SIP (Voice), Cloud Telephony, Data Centre, Mobility SIM Card, Cloud solutions (Azure/ AWS).
* Strong Interpersonal skills to liaise with other departments and people within the organization.
* Strong C-level connections in the SME & mid-market segment within the defined geography.
* Ability to lead by example and act with integrity.
* A minimum of 3+ years of experience in hardcore IT and telecommunication sales is mandatory.
Qualifications:
* Education: Bachelor's degree in Business, Marketing, Telecommunications, or a related field.
* Experience: Telecom sales, B2B/B2C sales, or a related field.
Skills:
* Strong communication, negotiation, and interpersonal skills.
* Ability to build relationships and handle objections.
* Knowledge of telecom products and services (mobile, internet, broadband, ILL, MPLS, PRI, SD-WAN, SIP trunk, etc.).
* Goal-oriented with a proven ability to achieve sales targets.
* Problem-solving skills and a customer-focused approach.
Benefits:
• Career Growth Opportunities: Emphasis on continuous learning, internal promotions, and opportunities for upward mobility within the organization.
• Performance-based Incentives: Bonuses, rewards, or recognition programs tied to meeting or exceeding performance goals.
• Professional Development: Access to training programs, certifications, and workshops to enhance skills and competencies.


As a Full Stack Developer, you should be comfortable with both front-end and back-end coding languages, development frameworks, and third-party libraries. You should also be a team player with a knack for visual design and utility.
Responsibilities:
- Work with development teams and product managers to ideate software solutions.
- Design client-side and server-side architecture.
- Build the front end of applications through appealing visual design.
- Develop and manage well-functioning databases and applications.
- Write effective APIs.
- Test software to ensure responsiveness and efficiency.
- Troubleshoot, debug, and upgrade software.
- Create security and data protection settings.
- Build features and applications with a mobile responsive design.
- Write technical documentation.
Requirements:
- Backend: Spring (Java), Spring Boot, Laravel (PHP), MySQL, NoSQL, NGINX Plus.
- Frontend: Angular 5+, NgRx/Store 5.
- Infrastructure: Google Cloud Platform (App Engine, Cloud SQL, BigQuery, Pub/Sub, Firebase Hosting), Pusher.io (WebSockets), Filestack, Postmark. Tools: Postman, JIRA.
- REST APIs, microservices, Agile, OAuth, message queues, Git.
- 6 years of proven experience as a Full Stack Developer or similar role.
- Experience working with service-oriented architectures and web-based solutions.
- Familiarity with common stacks.
- Knowledge of multiple front-end languages and libraries (e.g., HTML/CSS, JavaScript, XML, jQuery).
- Knowledge of multiple back-end languages (e.g., C#, Java, Python) and JavaScript frameworks (e.g., Angular, React, Node.js).
- Familiarity with databases (e.g., MySQL, MongoDB), web servers (e.g., Apache), and UI/UX design.
- Experience working with Agile processes (Scrum, Kanban).
- Experience working with AWS technologies.
- Excellent communication and teamwork skills.
- Great attention to detail.
- Organizational skills.
- An analytical mind.
- Degree in B.Tech/BE.


Introduction:
We are seeking a Full Stack Software Developer to join our innovative team, focusing on a project in the Food & Beverage (F&B) sector and contributing to an established Financial Services product. The ideal candidate will have extensive experience in both backend and frontend development, particularly with Python and ReactJS, and a proven ability to deliver mobile-responsive web applications.
Key Responsibilities:
- Develop and maintain web applications using Django or similar MVC frameworks.
- Design and implement mobile-responsive web pages using frontend technologies such as ReactJS.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Ensure the performance, quality, and responsiveness of applications.
Required Skills and Qualifications:
- Minimum of 3 years of experience in full stack development.
- Proficient in Python and experience with Django or other MVC frameworks.
- Strong experience in building mobile-responsive web pages using frontend technologies such as ReactJS.
- Familiarity with RESTful APIs and server-side integrations.
- Understanding of the entire web development process (design, development, and deployment).
- Excellent problem-solving skills and ability to work under tight deadlines.
What We Offer:
- Competitive salary and benefits package.
- Opportunities for professional growth and development.
- A collaborative and innovative work environment.
Sr. Data Engineer
Company Profile:
Bigfoot Retail Solutions [Shiprocket] is a logistics platform which connects Indian eCommerce SMBs with logistics players to enable end-to-end solutions.
Our innovative, data-backed platform drives logistics efficiency, helps reduce cost, increases sales throughput by reducing RTO, and improves post-order customer engagement and experience.
Our vision is to power all logistics for the direct commerce market in India, including first mile, linehaul, last mile, warehousing, cross-border and O2O.
Position: Sr. Data Engineer
Team: Business Intelligence
Location: New Delhi
Job Description:
We are looking for a savvy Data Engineer to join our growing team of analytics experts. The hire will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, database architects, data analysts and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems and products. The right candidate will be excited by the prospect of optimizing or even re-designing our company’s data architecture to support our next generation of products and data initiatives.
Key Responsibilities:
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure across national boundaries through multiple data centres and AWS regions.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
Qualifications for Data Engineer
- Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
- Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- We are looking for a candidate with 5+ years of experience in a Data Engineer role, who has attained a Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. They should also have experience using the following software/tools:
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Experience with stream-processing systems: Storm, Spark-Streaming, etc.
- Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
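The extraction, transformation, and loading responsibilities above can be sketched in miniature with standard-library Python. The CSV data, `shipments` schema, and filtering rule are invented for illustration; a production pipeline would use the Spark/Kafka/Airflow stack listed:

```python
import csv
import io
import sqlite3

# Toy source data; columns and values are hypothetical.
RAW_CSV = """order_id,weight_kg,zone
1001,2.5,north
1002,,south
1003,1.2,north
"""

def etl(raw: str, conn: sqlite3.Connection) -> int:
    """Extract rows from CSV, drop records with missing weight, load to SQLite."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS shipments "
        "(order_id INTEGER, weight_kg REAL, zone TEXT)"
    )
    loaded = 0
    for row in csv.DictReader(io.StringIO(raw)):
        if not row["weight_kg"]:  # transform: filter out incomplete records
            continue
        conn.execute(
            "INSERT INTO shipments VALUES (?, ?, ?)",
            (int(row["order_id"]), float(row["weight_kg"]), row["zone"]),
        )
        loaded += 1
    return loaded

conn = sqlite3.connect(":memory:")
loaded = etl(RAW_CSV, conn)
print(loaded)  # 2
```

The same extract/filter/load shape scales up when the CSV reader is swapped for a Spark or Kafka source and SQLite for Redshift.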

Hello,
Greetings for the day !!!
We at Tridat Technologies Pvt Ltd are hiring freshers and experienced candidates for our eCommerce vertical.
Ideal Candidate:
· Only graduate and diploma holder candidates can apply
· Excellent English communication
· Basic MS Office Knowledge
· Internet-savvy
· Familiar with Online Shopping
· Flair for eCommerce
· Flexible to work in any shifts
· Web Research knowledge and online product research
*CANDIDATE RESPONSIBILITIES:*
1. Communication with clients based in India & the US
2. Assist customers with their orders and issues faced
3. Data Organization and Analysis
4. Data research through the Google search engine
5. Catalogue Management
6. Report Generation
*BASIC REQUIREMENTS:*
1. Excellent Written and Verbal communication skills
2. Good working knowledge of MS Office tools (Excel, Word, PowerPoint)
3. Google search engine knowledge
4. Knowledge of E-commerce business would be an added advantage
*Training provided below:*
Basic Excel - 4 days
Intermediate Excel - 3 days
Advanced Excel - 3 days
Google - 3 days
E-commerce - 3 days
Induction
Interview after training
*Shifts:*
8am to 10pm - General Shift (any 9 hours)
6pm to 7am - Night Shift (any 9 hours, depending on project)
*Benefits & Facilities:*
Pick and Drop Facility (depends on shift timings)
Performance Bonus
Quarterly based rewards and recognition
Six-monthly to annual appraisals
*Processes in work*
Content moderation
Customer service
Item set up
Cataloging
Thanks & Regards
Shraddha Kamble


