

Position Responsibilities:
This role is part of the consulting team and will be responsible for performing data migrations from legacy and third-party databases, as well as data changes and data corrections. This role is customer-facing and is expected to conduct customer phone calls and presentations. Verifying the accuracy and completeness of all projects, to ensure quality work, is a crucial part of this role. In addition, this role is expected to contribute to the maintenance and enhancement of Deltek’s data migration tools and documents, as well as provide feedback and solutions to improve the scalability and overall implementation experience. The ideal candidate will be quality-focused while managing multiple priorities in a fast-paced environment to meet business and technical demands across a broad spectrum of customers from multiple industry verticals.
Job Duties:
- Performs data migrations, data changes, and data corrections.
- Develops and maintains extract, transform, and load (ETL) processes (e.g., Python, Airflow, SSIS) that handle diverse source systems and high-volume datasets.
- Learns proprietary data tools for existing and new products, including each product’s concepts, methods, procedures, technologies, and systems.
- Builds and governs a centralized repository of reusable, product-agnostic SQL queries, views, stored procedures, and functions.
- Drives Git-based version control, code reviews, and CI/CD for database artifacts.
- Assists customers and consultants with cleaning up existing data and building substitution tables to prepare for data migrations and changes.
- Verifies accuracy and completeness of converted data.
- Leads root-cause investigations and collaborates on corrective actions with the broader team.
- Provides thorough and timely communication regarding issues, change requests, and project statuses to the internal and external project team, including customers, project coordinators, functional consultants, and supervisors.
- Displays excellent communication skills, including the presentation, persuasion, and negotiation skills required in working with customers and co-workers, and the ability to communicate effectively and remain calm and courteous under pressure.
- Authors clear technical documentation (e.g., ETL designs, SQL library guidelines).
- Documents all data migrations, changes, and corrections accurately and completely in conversion delivery documentation.
- Works collaboratively in a team environment with a spirit of cooperation.
- Mentors colleagues on SQL techniques and ETL patterns, fostering a culture of continuous learning.
- Provides exceptional customer service.
- Estimates time and budget requirements for data migrations and changes.
- Works within budget and timeframe requirements for completion of each project.
- Handles changes in scope through documentation in conversion delivery documentation, statements of work, and change orders.
- Creates and maintains a procedures manual for data migrations, changes, and fixes.
- Files TFS tickets and communicates to the development team any bugs and enhancements needed for new and existing data tools, to improve the accuracy, timeliness, and profitability of work.
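Concretely, a migration of the kind described above pairs an extract-transform-load step with an explicit verification step before sign-off. The sketch below is illustrative only: it uses Python's built-in sqlite3 module and invented table and column names to stand in for real legacy and target systems.

```python
import sqlite3

def migrate_clients(legacy_conn, target_conn):
    """Extract client rows from a legacy schema, normalize them, load them
    into the target schema, and verify that row counts match."""
    rows = legacy_conn.execute(
        "SELECT client_id, client_name, phone FROM legacy_clients"
    ).fetchall()

    # Transform: trim whitespace and normalize empty phone numbers to NULL.
    cleaned = [
        (cid, name.strip(), (phone.strip() or None) if phone else None)
        for cid, name, phone in rows
    ]

    target_conn.executemany(
        "INSERT INTO clients (id, name, phone) VALUES (?, ?, ?)", cleaned
    )
    target_conn.commit()

    # Verify: source and target row counts must agree before sign-off.
    src = legacy_conn.execute("SELECT COUNT(*) FROM legacy_clients").fetchone()[0]
    dst = target_conn.execute("SELECT COUNT(*) FROM clients").fetchone()[0]
    assert src == dst, f"row count mismatch: {src} extracted, {dst} loaded"
    return dst

# Demo with in-memory databases standing in for the legacy and target systems.
legacy = sqlite3.connect(":memory:")
legacy.execute("CREATE TABLE legacy_clients (client_id INTEGER, client_name TEXT, phone TEXT)")
legacy.executemany("INSERT INTO legacy_clients VALUES (?, ?, ?)",
                   [(1, "  Acme Corp ", "555-0100"), (2, "Globex", "")])
target = sqlite3.connect(":memory:")
target.execute("CREATE TABLE clients (id INTEGER, name TEXT, phone TEXT)")
print(migrate_clients(legacy, target))  # prints 2
```

Real migrations would check far more than row counts (key integrity, checksums, spot reconciliation), but the shape — extract, clean, load, verify — is the same.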
Qualifications:
Essential Skills & Experience
- SQL Mastery: 5+ years crafting complex, high-performance T-SQL across platforms (e.g., SQL Server, PostgreSQL, Oracle).
- ETL Expertise: Demonstrated success designing, building, and operating end-to-end ETL/ELT pipelines in enterprise environments.
- Scripting & Orchestration: Proficiency in at least one language (e.g., Python (preferred) or PowerShell) and familiarity with orchestration tools (Airflow, Azure Data Factory, etc.).
- Version Control & CI/CD: Strong experience with Git workflows and automated deployment pipelines for database objects.
- Client-Facing Experience: Comfortable engaging with clients to capture requirements, present technical options, and deliver projects on schedule.
- 3+ years of hands-on experience in a customer-facing role
- Advanced written and verbal communication skills in English
- Advanced conflict management and negotiation skills
- Advanced troubleshooting and problem-solving skills
- Basic functional knowledge of Deltek VantagePoint, for validation purposes
- Basic understanding of accounting principles

About Deltek
Project-based businesses transform the world we live in. Deltek innovates and delivers software and solutions that power them to achieve their purpose. Our industry-specific software and information solutions maximize our customers' performance at every stage of the project lifecycle by enabling superior levels of project intelligence, management and collaboration.
Deltek is the recognized global standard for project-based businesses across government contracting and professional services industries, helping more than 30,000 organizations of all sizes deliver on their mission.
With over 4,200 employees worldwide, our team of industry experts is passionately committed to creating exceptional customer experiences.
Similar jobs
A raw material sourcing platform with a mission to be the primary source of raw materials for manufacturers globally. Their cross-border supply chain and tech solutions ensure manufacturers have access to the best-quality raw materials at the right price.
Job Description
As a Backend Engineer you will be responsible for creating REST APIs which are used to drive the User Interface. Given the nature of the application, these APIs need to be very efficient and high performing.
This requires optimizing queries for faster execution and introducing database changes that may be required. We are looking for individuals with great attention to detail who are genuine, confident, committed, and not only passionate about technology but excited to work in a fun and friendly start-up environment. The ideal candidate will be passionate about technology and GETTING IT DONE.
Responsibilities include:
Develop, test, implement and maintain application software
Take part in software and architectural development activities
Debug application issues and help support respond to client queries
Participate in application development meetings
Provide accurate estimates, clearly communicate status of tasks and identification of risks
Commit to accomplishing the task at hand and identify the fastest and most reliable way to solve a problem
Performance tuning of application code at different service levels
Interact with customers of the application and help address issues reported
ESSENTIAL SKILLS / EXPERIENCE REQUIRED
- Bachelor's degree in Computer Science or equivalent
- 1-3 years of experience with Java and Spring frameworks
- Good knowledge of a scripting language like Python is a plus
- Experience with Spring and Hibernate/ORM
- Understanding of relational databases and normal forms
- Understanding of NoSQL / RDBMS (Mongo or Postgres) and ability to write optimized and high-performing queries
- Strong understanding of Java concurrency, concurrency patterns, experience building thread-safe code
- Experience building RESTful web services
- Strong written and verbal communication skills
- Strong interpersonal skills and time management skills
- Strong problem-solving and analytical skills
- Experience with Git as a VCS and Unix-based systems
- Experience with a NoSQL database is a plus
The following are 'good to have' skills:
- React, AWS, Bitbucket, JIRA
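The concurrency requirement above (thread-safe code under contention) is language-agnostic even though this posting centers on Java. As a minimal illustrative sketch, here is a lock-protected counter in Python; the same pattern maps to `synchronized` blocks or `java.util.concurrent` locks in Java:

```python
import threading

class SafeCounter:
    """A counter safe for concurrent increments; a lock guards the shared state."""
    def __init__(self):
        self._value = 0
        self._lock = threading.Lock()

    def increment(self):
        # Without the lock, the read-modify-write below can lose updates
        # when two threads interleave.
        with self._lock:
            self._value += 1

    @property
    def value(self):
        with self._lock:
            return self._value

counter = SafeCounter()
threads = [threading.Thread(target=lambda: [counter.increment() for _ in range(10_000)])
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter.value)  # prints 40000
```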
Role Description
This is a full-time remote role for a SFMC Developer at Cloudsheer. The SFMC Developer will be responsible for implementing projects, optimizing performance, and ensuring reliability in digital solutions. Day-to-day tasks include coding, testing, and maintaining Salesforce Marketing Cloud platforms.
Qualifications
- Proficiency in Salesforce Marketing Cloud (SFMC) development
- Experience in coding, testing, and maintaining SFMC platforms
- Knowledge of SQL, AMPscript, and HTML for SFMC implementation
- Understanding of digital marketing strategies and automation
- Strong problem-solving and analytical skills
- Excellent communication and collaboration abilities
- Ability to work independently and in a remote team environment
- Bachelor's degree in Computer Science, Information Technology, or related field

Requirement :
- 8+ years of experience in software development and in the design and implementation of web applications (i.e., having an architectural sense)
- Hands-on analysis, design, and implementation of microservices-based applications using .Net Core, Angular, React, Web API, Entity Framework, JavaScript, and SQL Server
- Must have exposure to website development using Angular/React
- Knowledge of Docker and Kubernetes
- Hands-on experience working in Azure
- Hands-on experience with SOLID principles & architectural design patterns
- MS-SQL server knowledge
- Experience in Application Security and Performance tuning.
- Good to have experience in creating automated build and test environments / CICD/ DevOps
- Experience in defining and implementing hybrid scenarios with workloads shared across on-premises and public cloud, using a zero-trust approach
- Highly effective team-working skills, with an established track record in building teams at all technical levels
- Research-oriented, flexible, open to change, and quick to adapt
- Passionate to work on mission-critical tasks and solve real-world problems
- Healthcare Domain knowledge is a big plus
Responsibilities :
- Analysis, design, and implementation of microservices-based applications using the Microsoft and cloud technology stack.
- Understand the benefits of the various tech-stacks and accurately suggest the use of the appropriate technology for the applications.
- Develop future-ready solutions/PoCs and train the developers.
- Provide technical direction & solutions to the development team.
- Establish effective controls for processes (tech standards, code review, release management) and functions performed by the team.
- Effectively work with internal IT teams to help establish the necessary infrastructure for the platform appropriate for business volumes.
- Effectively work on multiple, parallel projects using exceptional organizational and time management techniques to successfully complete tasks in a timely manner
LogiNext is looking for a technically savvy and passionate QA Engineer to cater to the testing efforts in the domain of manual testing. You will help the team build an awesome platform from scratch and test the product for quality & stability.
You should have hands-on experience in testing and writing test cases, and in developing and executing exploratory tests to ensure product quality. You will estimate, plan, and coordinate testing activities, and ensure that quality issues and defects are appropriately identified, documented, tracked, and resolved in our defect tracking system. You should have strong interpersonal and communication skills.
Responsibilities:
Design, implement & execute the test cases for product features
Identify, record & track the bugs or enhancements early in the release life cycle
Create detailed, comprehensive and well-structured test plans and test cases
Perform thorough regression testing when bugs are resolved and support testing quick patches to production
Review requirements specifications and technical design documents to provide timely and meaningful feedback
Create repeatability in testing that enables and validates high quality releases
Identify functional/non-functional issues and come up with scalable resolutions
Engage actively with cross functional teams to enable timely delivery through good planning, proactive communication and timely execution
Requirements:
Bachelor’s degree in Computer Science, Information Technology or a related field
8 to 10 years of relevant experience in the testing domain for SaaS products
Experience in automation testing and using testing tools like Selenium, JUnit, JMeter, Appium
Expertise in testing distributed and scalable web and mobile applications
Hands-on experience in non-functional testing skills like Load Testing, Performance Testing and Security Testing
Experience in testing APIs, web and mobile applications
Experience in working in a Linux/Unix environment
Knowledge of DevOps is an added advantage
Understanding of Software Development (preferably Java) and Continuous Integration systems
Experience with testing on AWS environments
Experience working in an Agile environment
Excellent written and oral communication skills, judgment and decision-making skills, and the ability to work under continual deadline pressure
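As an illustration of the detailed, well-structured test cases described above, here is a minimal sketch using Python's standard unittest module. The function under test, `apply_discount`, and all names in it are invented for the example; real suites would target actual product features:

```python
import unittest

def apply_discount(price, percent):
    """Hypothetical function under test: price after a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class ApplyDiscountRegressionTests(unittest.TestCase):
    # Happy path: a typical discount.
    def test_typical_discount(self):
        self.assertEqual(apply_discount(200.0, 25), 150.0)

    # Boundary values: 0% and 100% are both valid per the spec.
    def test_boundaries(self):
        self.assertEqual(apply_discount(80.0, 0), 80.0)
        self.assertEqual(apply_discount(80.0, 100), 0.0)

    # Negative case: out-of-range input must be rejected, not silently accepted.
    def test_invalid_percent_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(80.0, 120)

# Run the suite explicitly so the result object can be inspected.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(ApplyDiscountRegressionTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Covering the happy path, the boundaries, and the failure mode in separate, named tests is what makes regression runs meaningful when a bug fix lands.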
Job Requirements:
• Proven working experience as a business development manager, sales executive or a relevant role
• Proven sales track record
• Experience in customer support is a plus
• Proficiency in MS Office
• Proficiency in English
• Market knowledge
• Communication and negotiation skills
• Ability to build rapport
• Time management and planning skills
• BSc/BA in business administration, sales or relevant field
Job Responsibilities:
• Develop a growth strategy focused both on financial gain and customer satisfaction
• Manage complete client and internal operations
• Conduct research to identify new markets and customer needs
• Arrange business meetings with prospective clients
• Promote the company’s products/services addressing or predicting clients’ objectives
• Prepare sales contracts ensuring adherence to law-established rules and guidelines
• Keep records of sales, revenue, invoices etc.
• Provide trustworthy feedback and after-sales support
• Build long-term relationships with new and existing customers
• Develop entry level staff into valuable salespeople



Data Semantics
We are a product-based company and a Microsoft Partner.
Data Semantics is an award-winning Data Science company with a vision to empower every organization to harness the full potential of its data assets. To achieve this, we provide Artificial Intelligence, Big Data, and Data Warehousing solutions to enterprises across the globe. Data Semantics was listed as one of the top 20 Analytics companies by Silicon India in 2018 and as one of the top 20 BI companies by CIO Review India in 2014. We are headquartered in Bangalore, India, with offices in six global locations including the USA, United Kingdom, Canada, United Arab Emirates (Dubai and Abu Dhabi), and Mumbai. Our mission is to enable our people to learn the art of data management and visualization to help our customers make quick and smart decisions.
Our Services include:
Business Intelligence & Visualization
App and Data Modernization
Low Code Application Development
Artificial Intelligence
Internet of Things
Data Warehouse Modernization
Robotic Process Automation
Advanced Analytics
Our Products:
Sirius – World’s most agile conversational AI platform
Serina
Conversational Analytics
Contactless Attendance Management System
Company URL: https://datasemantics.co
JD:
.Net
.Net Core
Logic App
SQL
Regards,
Deepu Vijayan | Talent Acquisition
Data Analyst
Job Description
Summary
Are you passionate about handling large & complex data problems, want to make an impact and have the desire to work on ground-breaking big data technologies? Then we are looking for you.
At Amagi, great ideas have a way of becoming great products, services, and customer experiences very quickly. Bring passion and dedication to your job and there's no telling what you could accomplish. Would you like to work in a fast-paced environment where your technical abilities will be challenged on a day-to-day basis? If so, Amagi’s Data Engineering and Business Intelligence team is looking for passionate, detail-oriented, technically savvy, energetic team members who like to think outside the box.
Amagi’s Data warehouse team deals with petabytes of data catering to a wide variety of real-time, near real-time and batch analytical solutions. These solutions are an integral part of business functions such as Sales/Revenue, Operations, Finance, Marketing and Engineering, enabling critical business decisions. Designing, developing, scaling and running these big data technologies using native technologies of AWS and GCP are a core part of our daily job.
Key Qualifications
- Experience in building highly cost-optimised data analytics solutions
- Experience in designing and building dimensional data models to improve the accessibility, efficiency, and quality of data
- Hands-on experience in building high-quality ETL applications, data pipelines, and analytics solutions that ensure data privacy and regulatory compliance
- Experience in working with AWS or GCP
- Experience with relational and NoSQL databases
- Experience with full-stack web development (preferably Python)
- Expertise with data visualisation systems such as Tableau and QuickSight
- Proficiency in writing advanced SQL queries, with expertise in performance tuning when handling large data volumes
- Familiarity with ML/AI technologies is a plus
- Demonstrated strong understanding of development processes and agile methodologies
- Strong analytical and communication skills; self-driven, highly motivated, and able to learn quickly
Description
Data Analytics is at the core of our work, and you will have the opportunity to:
- Design data-warehousing solutions on Amazon S3 with Athena, Redshift, GCP Bigtable, etc.
- Lead quick prototypes by integrating data from multiple sources
- Do advanced business analytics through ad-hoc SQL queries
- Work on Sales/Finance reporting solutions using Tableau, HTML5, and React applications
We build amazing experiences and create depth in knowledge for our internal teams and our leadership. Our team is a friendly bunch of people that help each other grow and have a passion for technology, R&D, modern tools and data science.
Our work relies on a deep understanding of the company's needs and an ability to go through vast amounts of internal data such as sales, KPIs, forecasts, inventory, etc. Key expectations of this role include data analytics, building data lakes, and end-to-end reporting solutions. If you have a passion for cost-optimised analytics and data engineering and are eager to learn advanced data analytics at a large scale, this might just be the job for you.
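As a small illustration of the ad-hoc analytical SQL mentioned above, the sketch below runs a revenue-per-region rollup against Python's built-in sqlite3 with invented sample data; the same query shape works on Athena, Redshift, or any other SQL engine:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)", [
    ("APAC", 120.0), ("APAC", 80.0), ("EMEA", 150.0), ("AMER", 50.0),
])

# Ad-hoc rollup: total revenue per region, highest first.
query = """
    SELECT region, SUM(amount) AS total
    FROM sales
    GROUP BY region
    ORDER BY total DESC
"""
rows = list(conn.execute(query))
for region, total in rows:
    print(region, total)  # APAC 200.0 / EMEA 150.0 / AMER 50.0
```

At warehouse scale the tuning work is in partitioning, indexing, and scan pruning rather than the query text itself, but prototyping the aggregation locally first is a cheap sanity check.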
Education & Experience
A bachelor’s/master’s degree in Computer Science with 5 to 7 years of experience; previous experience in data engineering is a plus.
- Role/Level: AR Analyst- Process Analyst (PA)/ Sr Process Analyst(SPA)
- Notice period: Immediate joiners preferred (do not submit profiles whose salary expectation is above 4.6 LPA fixed)
- Experience: 1+ years(for PA)/2+ years (for SPA)
- Location- Bangalore, Bannerghatta Road, Arekere gate & Chennai
- Must have:
Accounts receivable, AR calling, Denial management.
Process: Voice.
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Author data services using a variety of programming languages
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and Azure ‘big data’ technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure across national boundaries through multiple data centers and Azure regions.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
- Work in an Agile environment with Scrum teams.
- Ensure data quality and help in achieving data governance.
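A pipeline of the shape described above (extract, transform with a data-quality gate, load) can be sketched in a few lines. This toy example uses only the Python standard library, with an inline CSV string and invented field names standing in for a real upstream source and an Azure data store:

```python
import csv
import io

# Toy "extract" source standing in for a real upstream system.
RAW = """order_id,amount,country
1,19.99,US
2,,DE
3,42.50,US
"""

def extract(text):
    """Read the raw feed into a list of dict records."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Type the good records; quarantine records failing the quality rule
    (missing amount) instead of silently loading them."""
    good, bad = [], []
    for r in rows:
        if r["amount"]:
            good.append({"order_id": int(r["order_id"]),
                         "amount": float(r["amount"]),
                         "country": r["country"]})
        else:
            bad.append(r)
    return good, bad

def load(rows, sink):
    """Append transformed records to the target store; return the count loaded."""
    sink.extend(rows)
    return len(rows)

warehouse = []
good, rejected = transform(extract(RAW))
loaded = load(good, warehouse)
print(loaded, len(rejected))  # prints 2 1
```

Quarantining rejects rather than dropping them is what makes the later data-governance step possible: someone can inspect, correct, and replay them.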
Basic Qualifications
- 2+ years of experience in a Data Engineer role
- Undergraduate degree required (Graduate degree preferred) in Computer Science, Statistics, Informatics, Information Systems or another quantitative field.
- Experience using the following software/tools:
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases
- Experience with data pipeline and workflow management tools
- Experience with Azure cloud services: ADLS, ADF, ADLA, AAS
- Experience with stream-processing systems: Storm, Spark-Streaming, etc.
- Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.

