
Pecksmart is looking to hire a Telecaller at our Experience Centre.
Requirements:
High school diploma or equivalent.
Work experience as a Telecaller, Telemarketer, or in a similar role in a sales department.
Professional certification in sales and marketing will be an advantage.
Great interpersonal skills.
Exceptional oral and written communication skills.
Strong organizational skills.
Ability to work in a team or individually as and when required.
Ability to manage and handle multiple tasks.
Exceptional attention to detail.
Hard-working individual.
Good time management abilities.
Strong decision-making skills.
Ability to tolerate stress and pressure.

Job role: Executive - Accounts
- Location: Ghatkopar - Mumbai
- Salary: 5-6LPA
Major Responsibilities:
• Hands-on experience with Tally ERP and its full feature set: group companies, inventories, branch accounting, master maintenance, cost centres, report generation, Excel/PDF export and import, consolidation, and automated report generation.
• Proficient in MS Excel and automation.
• Monthly GST return preparation and filings.
• Payroll accounting: net pay, deductions, salary advances, and reconciliation with the GL.
• PF/ESI accounting and reconciliations.
• Cash management, accounting for bank transactions, and monthly BRS.
• Inter-company accounting and reconciliation.
• Processing of payments to vendors, consultants, and staff; creditors ageing and ledger confirmation of balances.
- Strong Sales Leadership Profile
- Mandatory (Experience 1) – Must have 6+ years of B2B Sales experience
- Mandatory (Experience 2) – Must have 2+ years of experience in SaaS, ERP, Garment-Tech, or Textile Manufacturing domains
- Mandatory (Experience 3) – Must have a proven track record of revenue target achievement and closing large enterprise deals (deal size of ₹50L+ ARR)
- Mandatory (Experience 4) – Must have a minimum of 1+ years of experience managing field sales teams across regions
- Mandatory (Skills) – Must have experience in CXO-level stakeholder management, with strong communication and ROI-based sales pitching
Preferred
- Preferred (Experience) – Experience leading sales in manufacturing hubs or regional clusters with frequent travel exposure
- Preferred (Education) – NIIFT alumni preferred; open to strong performers from related fields.

Summary:
We are seeking a highly skilled Data Engineer with extensive experience in designing, building, and maintaining complex data pipelines and architectures on the AWS platform. The ideal candidate will have a strong understanding of data warehousing, ETL/ELT processes, and big data technologies.
Responsibilities:
- Design, develop, and implement scalable data pipelines using AWS services (Glue, EMR, DMS, S3, Redshift, etc.).
- Architect and maintain data lakes and data warehouses to support business intelligence and analytics.
- Optimize data processing and storage for efficiency and cost-effectiveness.
- Implement data quality checks and monitoring to ensure data integrity.
- Collaborate with cross-functional teams (data scientists, analysts, software engineers) to understand data requirements and deliver solutions.
- Stay updated with emerging technologies and industry best practices in data engineering.
Qualifications:
- 7+ years of experience in data engineering or a related field.
- Strong understanding of data warehousing concepts and dimensional modeling.
- Hands-on experience with AWS services (Glue, EMR, Athena, DMS, S3, Redshift, EC2, Lambda, etc.).
- Proficiency in SQL and Python.
- Experience with big data technologies (Spark/PySpark, Kafka, etc.).
- Familiarity with CI/CD pipelines and DevOps practices.
- Excellent problem-solving and analytical skills.
- Ability to work independently and collaboratively within a team.
Good to Have:
- Experience with Snowflake, EMR, Databricks, or other cloud data warehouses.
- Knowledge of data visualization tools (Tableau, Superset, Power BI, etc.).
- Experience in the healthcare or insurance industry
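The data-quality checks mentioned in the responsibilities above could be sketched as a small validation gate inside a pipeline job. This is a minimal illustration, not a real pipeline step; the record schema, field names, and null-ratio threshold are all hypothetical assumptions:

```python
# Minimal sketch of a data-quality gate, as might run inside a Glue or EMR job.
# The schema and thresholds below are illustrative assumptions.

def check_batch(records, required_fields, max_null_ratio=0.01):
    """Return (ok, report) for a batch of dict records.

    report maps each required field to its null ratio; ok is True only
    when every field stays under the allowed null ratio.
    """
    total = len(records)
    report = {}
    for field in required_fields:
        nulls = sum(1 for r in records if r.get(field) is None)
        report[field] = nulls / total if total else 0.0
    ok = all(ratio <= max_null_ratio for ratio in report.values())
    return ok, report

batch = [
    {"order_id": 1, "amount": 10.0},
    {"order_id": 2, "amount": None},
]
ok, report = check_batch(batch, ["order_id", "amount"], max_null_ratio=0.1)
print(ok, report)  # amount has a 50% null ratio, so the check fails
```

In a real pipeline, a failing batch would typically be quarantined to a separate S3 prefix and surfaced through monitoring rather than silently dropped.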
Adobe Workfront SME – Role Summary
Looking for a Subject Matter Expert in Adobe Workfront to lead solution design, client education, and technical execution across various projects. This role demands strong client interaction skills, in-depth technical proficiency in Workfront and Workfront Fusion, and the ability to translate business needs into scalable solutions.
Key Responsibilities
- Act as a recognized expert in Adobe Workfront solutions.
- Educate clients on best practices and industry standards.
- Analyze complex project challenges and recommend optimal solutions.
- Handle multiple client engagements simultaneously.
- Keep up with the latest Adobe Workfront developments.
- Maintain accurate records and proactively communicate project updates.
- Support enablement and coaching for clients and partners.
- Collaborate closely with internal and external stakeholders.
Requirements
- Strong command of Adobe Workfront and Workfront Fusion (module setup, scenario development, troubleshooting).
- Proven ability to turn business requirements into architectural and technical solutions.
- Skilled in integrations and communicating technical concepts to non-technical stakeholders.
- Excellent facilitation, communication, and presentation skills.
- Self-driven, adaptive, and solution-oriented with a collaborative mindset.
- Strong organizational and client management capabilities.
Salesforce DevOps/Release Engineer
Resource type - Salesforce DevOps/Release Engineer
Experience - 5 to 8 years
Norms - PF & UAN mandatory
Resource Availability - Immediate or Joining time in less than 15 days
Job - Remote
Shift timings - UK hours (1 pm to 10 pm or 2 pm to 11 pm)
Required Experience:
- 5–6 years of hands-on experience in Salesforce DevOps, release engineering, or deployment management.
- Strong expertise in Salesforce deployment processes, including CI/CD pipelines.
- Significant hands-on experience with at least two of the following tools: Gearset, Copado, Flosum.
- Solid understanding of Salesforce architecture, metadata, and development lifecycle.
- Familiarity with version control systems (e.g., Git) and agile methodologies
Key Responsibilities:
- Design, implement, and manage CI/CD pipelines for Salesforce deployments using Gearset, Copado, or Flosum.
- Automate and optimize deployment processes to ensure efficient, reliable, and repeatable releases across Salesforce environments.
- Collaborate with development, QA, and operations teams to gather requirements and ensure alignment of deployment strategies.
- Monitor, troubleshoot, and resolve deployment and release issues.
- Maintain documentation for deployment processes and provide training on best practices.
- Stay updated on the latest Salesforce DevOps tools, features, and best practices.
Technical Skills:
- Deployment Tools: Hands-on with Gearset, Copado, and Flosum for Salesforce deployments
- CI/CD: Building and maintaining pipelines, automation, and release management
- Version Control: Proficiency with Git and related workflows
- Salesforce Platform: Understanding of metadata, SFDX, and environment management
- Scripting: Familiarity with scripting (e.g., Shell, Python) for automation (preferred)
- Communication: Strong written and verbal communication skills
Preferred Qualifications:
Bachelor’s degree in Computer Science, Information Technology, or related field.
Certifications:
Salesforce certifications (e.g., Salesforce Administrator, Platform Developer I/II) are a plus.
Experience with additional DevOps tools (Jenkins, GitLab, Azure DevOps) is beneficial.
Experience with Salesforce DX and deployment strategies for large-scale orgs.
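The CI/CD pipeline work described above usually wraps the Salesforce CLI. As a hedged sketch, a pipeline step might assemble the deploy command like this; the flag names follow the modern `sf` CLI (`sf project deploy start`), but verify them against your installed CLI version, and the org alias and source directory are placeholders:

```python
# Sketch of how a CI job might assemble a Salesforce CLI deploy command.
# Flags follow the modern `sf` CLI; confirm against your CLI version.
# "uat" and "force-app" below are hypothetical values.

def build_deploy_cmd(org_alias, source_dir, check_only=False):
    """Build the argument list for a source deploy with local tests."""
    cmd = [
        "sf", "project", "deploy", "start",
        "--target-org", org_alias,
        "--source-dir", source_dir,
        "--test-level", "RunLocalTests",
    ]
    if check_only:
        cmd.append("--dry-run")  # validate without saving to the org
    return cmd

cmd = build_deploy_cmd("uat", "force-app", check_only=True)
print(" ".join(cmd))
# A real pipeline step would then run: subprocess.run(cmd, check=True)
```

Building the command as a list (rather than a shell string) avoids quoting issues when org aliases or paths contain spaces.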
Job Role - Power BI Lead
9 to 12 Years Experience Required.
Location - Pune (Baner / Viman Nagar)
Work Model - Hybrid (Wednesday and Thursday WFO) 12 PM to 9 PM
Experience with Banking or GRC Domain is preferred.
- JOB SUMMARY
- Role Overview: We are seeking a highly skilled Power BI expert to lead the design, development, implementation, and governance of Power BI solutions. The ideal candidate will have in-depth knowledge of Power BI architecture, data modeling, governance, embedded analytics, and database management. The role requires expertise in Power BI Data Gateways, report deployment, and governance frameworks to ensure scalable and secure data solutions.
- PRIMARY RESPONSIBILITIES
- Power BI Lead & Implementation:
- Design, develop, and deploy interactive Power BI reports and dashboards.
- Create efficient data models to optimize performance and scalability.
- Develop complex DAX expressions for business logic and calculations.
- Optimize report performance by using best practices in Power BI and SQL
- Power BI Architecture & Configuration:
- Configure and manage Power BI Data Gateways for secure and seamless data access
- Define and enforce Power BI workspace, dataset, and security policies.
- Implement row-level security (RLS) and data governance best practices.
- Establish data refresh schedules and ensure efficient data ingestion pipelines.
- Maintain and enhance Power BI Premium and Embedded solutions.
- Embedded Analytics & Integration:
- Integrate Power BI reports with external applications using Power BI Embedded.
- Work with Power BI REST APIs to automate workflows.
- Integrate Power BI with Oracle, SQL Server, MySQL, Microsoft SharePoint, Excel, cloud data sources, etc.
- Database & Performance Optimization:
- Write optimized SQL queries and stored procedures for report development.
- Ensure high-performance data refreshes and query execution.
- Work with the ETL team to improve data integration with Power BI.
- Governance & Security:
- Define Power BI governance framework and best practices for standardization.
- Monitor user access, performance, and usage analytics to drive efficiency.
- Manage user roles, access controls, and data security.
- PowerApps & Power Automate (Nice to Have):
- Build PowerApps applications to extend Power BI functionality and create interactive business solutions.
- Automate data flows and reporting updates using Power Automate (flows, triggers, approvals, notifications, etc.).
- Integrate Power BI, PowerApps, and Power Automate to create end-to-end business process automation.
- Stakeholder Collaboration & Training:
- Work closely with business users, data engineers, and leadership teams to understand and document reporting requirements.
- Provide training and best practice guidance to Power BI users across the organization.
- Develop self-service Power BI frameworks to empower business teams for reporting.
- Troubleshoot Power BI performance and user issues.
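Automating workflows over the Power BI REST API, as mentioned above, starts from its resource URLs. The helper below is a hypothetical sketch: the route mirrors the documented `datasets/{id}/refreshes` refresh endpoint, but confirm against the current Microsoft API reference, and the IDs used are placeholders:

```python
# Hypothetical helper around the Power BI REST API refresh endpoint.
# The route follows the documented
#   POST /v1.0/myorg/groups/{groupId}/datasets/{datasetId}/refreshes
# shape, but verify against the current API reference before use.
BASE = "https://api.powerbi.com/v1.0/myorg"

def refresh_url(dataset_id, group_id=None):
    """Build the URL used to trigger a dataset refresh.

    With group_id, targets a workspace; without, the personal workspace.
    """
    if group_id:
        return f"{BASE}/groups/{group_id}/datasets/{dataset_id}/refreshes"
    return f"{BASE}/datasets/{dataset_id}/refreshes"

url = refresh_url("ds-123", group_id="ws-456")
print(url)
```

A real automation would POST to this URL with an Azure AD bearer token; keeping URL construction in one place makes those calls easier to audit and test.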
Description
Do you dream about code every night? If so, we’d love to talk to you about a new product that we’re making to enable delightful testing experiences at scale for development teams who build modern software solutions.
What You'll Do
Troubleshooting and analyzing technical issues raised by internal and external users.
Working with Monitoring tools like Prometheus / Nagios / Zabbix.
Developing automation in one or more technologies such as Terraform, Ansible, CloudFormation, Puppet, or Chef is preferred.
Monitor infrastructure alerts and take proactive action to avoid downtime and customer impacts.
Working closely with the cross-functional teams to resolve issues.
Design, build, test, deploy, and maintain continuous integration and continuous delivery processes using tools like Jenkins, Maven, Git, etc.
Work in close coordination with the development and operations teams to ensure the application performs in line with customer expectations.
What you should have
Bachelor’s or Master’s degree in computer science or any related field.
3 - 6 years of experience in Linux / Unix, cloud computing techniques.
Familiar with working on cloud and datacenter for enterprise customers.
Hands-on experience with Linux, Windows, and macOS, and with Batch/AppleScript/Bash scripting.
Experience with various databases such as MongoDB, PostgreSQL, MySQL, MSSQL.
Familiar with AWS technologies like EC2, S3, Lambda, IAM, etc.
Must know how to choose the best tools and technologies which best fit the business needs.
Experience in developing and maintaining CI/CD processes using tools like Git, GitHub, Jenkins etc.
Excellent organizational skills to adapt to a constantly changing technical environment
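The "monitor infrastructure alerts and take proactive action" duty above typically boils down to alerting only on sustained failures rather than single blips. A toy sketch of that policy, with hypothetical names and thresholds:

```python
# Illustrative sketch of alert handling: escalate only after repeated
# consecutive probe failures. All names and thresholds are assumptions;
# real setups would express this in Prometheus alerting rules instead.

def evaluate_alert(consecutive_failures, threshold=3):
    """Decide whether the current failure streak warrants paging someone."""
    return "page" if consecutive_failures >= threshold else "watch"

def run_probes(results, threshold=3):
    """results: iterable of booleans from successive health probes."""
    consecutive = 0
    actions = []
    for ok in results:
        consecutive = 0 if ok else consecutive + 1
        actions.append(evaluate_alert(consecutive, threshold))
    return actions

print(run_probes([True, False, False, False, True]))
# -> ['watch', 'watch', 'watch', 'page', 'watch']
```

Requiring a streak before paging is the same idea as a Prometheus `for:` clause on an alert rule: it trades a little detection latency for far fewer false pages.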
Responsibilities:
The Senior Information Security Engineer is responsible for the implementation, execution, and maintenance of technology solutions that mitigate risk and protect the IT and Engineering environments by reducing the probability of, and minimizing the effects of, damage caused by malware, malicious activities, and security events.
The individual will help protect the company by deploying, tuning, and managing security tools across the computing environment, as well as providing security incident response support. They should have a passion and skill for identifying the latest cyber threats.
Basic Qualifications
- Working knowledge of infrastructure-as-code and CI/CD pipeline tools (e.g., Jenkins, TeamCity, CircleCI)
- Lead and participate in major day-to-day operational aspects of the security engineering team including improvement of current security controls while constantly identifying areas of needed improvement
- Deep hands-on security experience with cloud providers, such as AWS, GCP, Azure
- Understanding of automated security testing approaches and tools
- Experience with proactive integration of security into the development process
- Lead continuous improvement efforts of our security tools and systems (concentrating on SIEM, IDS, and EDR tools)
- Work with our customers (Security Operations, Incident Response, and Product teams) to incorporate high quality security alerting into their operational workflows
- Improve overall security practitioner efficiency through process automation
- Foster and promote collaboration among all members of the IT, Infrastructure, and Risk Management Departments.
Minimum Qualifications/Requirements
- BS or MS in Computer Science or related field
- Minimum 7+ years of cybersecurity experience
- Must have previous experience performing threat hunting and incident response duties using SIEM tools, cybersecurity management consoles, and ticketing systems
- Experience in deployment, development, and maintenance of SIEM
- Experience writing and using Ansible server administration scripts, and creating simple Python, Bash, or PowerShell scripts to automate cybersecurity functions
- Scripting experience to automate security operations, alerting, and compliance checks, CI/CD design, deployment, and management
- Experience with managing endpoint response and detection infrastructure and endpoints at the enterprise level, including performing upgrades to the back end application and deploying new agent versions to endpoints
- Understanding the investigative process and performing triage for cybersecurity incidents
- Experience maintaining industry-leading security technologies or infrastructure systems in complex technical IT operations environments
- Must be detail-oriented and organized with ability to handle competing demands while meeting deadlines
- Experience in authentication protocols and frameworks to include OAuth, and AWS IAM
- Proactive and motivated; team player with a positive can-do attitude
- Strong analytical/problem-solving skills and cross-functional knowledge across multiple IT operational and security disciplines
- Ability to communicate technical concepts to a broad range of technical and non-technical staff
- Must possess a high degree of integrity, be trustworthy, and have the ability to lead and inspire change
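The "simple Python scripts to automate cybersecurity functions" requirement above often means small log-triage utilities feeding a SIEM. A toy sketch of one such check, flagging source IPs with repeated failed logins; the event format and threshold are assumptions, not any particular SIEM's schema:

```python
# Toy example of SIEM-style log triage: flag source IPs that accumulate
# many failed logins. The (ip, outcome) event shape and the threshold of 5
# are illustrative assumptions.
from collections import Counter

def flag_brute_force(events, threshold=5):
    """events: list of (ip, outcome) tuples; return IPs at/over threshold."""
    fails = Counter(ip for ip, outcome in events if outcome == "FAIL")
    return sorted(ip for ip, n in fails.items() if n >= threshold)

events = [("10.0.0.9", "FAIL")] * 6 + [("10.0.0.7", "OK"), ("10.0.0.7", "FAIL")]
print(flag_brute_force(events))  # ['10.0.0.9']
```

In practice the same rule would be expressed in the SIEM's query language with a time window, but a standalone script like this is useful for ad-hoc hunting over exported logs.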
Opportunity for Unix Developers!
We at Datametica are looking for talented Unix engineers who will be trained and given the opportunity to work on Google Cloud Platform, DWH, and Big Data.
Experience - 2 to 7 years
Job location - Pune
Mandatory Skills:
Strong experience in Unix with Shell Scripting development.
What opportunities do we offer?
- Selected candidates will be provided training in one or more of the following: Google Cloud, AWS, DevOps tools, and Big Data technologies such as Hadoop, Pig, Hive, Spark, Sqoop, Flume, and Kafka
- You will get the chance to be part of enterprise-grade implementations of Cloud and Big Data systems
- You will play an active role in setting up a modern data platform based on Cloud and Big Data
- You will be part of teams with rich experience in various aspects of distributed systems and computing.

