11+ NIST 800-53 Jobs in Pune | NIST 800-53 Job openings in Pune
Apply to 11+ NIST 800-53 Jobs in Pune on CutShort.io. Explore the latest NIST 800-53 Job opportunities across top companies like Google, Amazon & Adobe.
What will you do?
Governance and Policy Development
· Develop, implement, and maintain governance policies, SOPs, and related documentation.
· Ensure all policies align with industry standards (e.g., FedRAMP, NIST SP 800-53, ISO 27001 family, and HIPAA).
· Monitor policy effectiveness and recommend updates based on organizational changes or regulatory updates.
Risk Management
· Conduct risk assessments to identify vulnerabilities, threats, and compliance gaps.
· Collaborate with cross-functional teams to design and implement remediation strategies.
· Maintain risk registers and monitor mitigation efforts.
Compliance Oversight
· Support the organization in achieving and maintaining FedRAMP certification.
· Manage periodic audits, security assessments, and readiness activities for compliance frameworks.
· Track and report on compliance metrics, audit findings, and resolution status.
Training and Awareness
· Develop and deliver training programs to enhance employee understanding of compliance policies and procedures.
· Act as a point of contact for compliance-related queries within the organization.
Incident Response and Reporting
· Support incident response processes to ensure effective investigation and reporting of compliance-related incidents.
· Collaborate with stakeholders to implement corrective actions and prevent recurrence.
Vendor and Third-Party Risk Management
· Assess third-party vendors for compliance with organizational policies and standards.
· Ensure contracts include appropriate compliance requirements.
What do you bring to the table?
Education & Experience
· Overall 12 to 15 years of relevant experience
· Bachelor's degree in Information Technology, Cybersecurity, Risk Management, or related field (Master’s preferred).
· 3+ years of experience in governance, risk, and compliance roles, with specific experience in FedRAMP compliance.
Knowledge & Skills
· Strong understanding of FedRAMP, NIST SP 800-53, ISO 27001, and other relevant frameworks.
· Experience in drafting policies, procedures, and SOPs.
· Familiarity with GRC tools and platforms (e.g., Archer, ServiceNow GRC).
· Excellent communication and documentation skills.
· Analytical mindset with attention to detail.
Certifications (Preferred)
· Certified Information Systems Security Professional (CISSP)
· Certified Information Systems Auditor (CISA)
· Certified Information Security Manager (CISM)
· ISO 27001 Lead or Internal auditor
* Python (3 to 6 years): Strong expertise in data workflows and automation
* Spark (PySpark): Hands-on experience with large-scale data processing
* Pandas: For detailed data analysis and validation
* Delta Lake: Managing structured and semi-structured datasets at scale
* SQL: Querying and performing operations on Delta tables
* Azure Cloud: Compute and storage services
* Orchestrator: Good experience with either ADF or Airflow
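The stack above centers on SQL operations over Delta tables. Delta Lake upserts are usually written as Spark SQL MERGE statements; as a minimal stand-in that runs without a Spark cluster, the same upsert pattern can be sketched with stdlib sqlite3 (the table and column names here are illustrative, not from the posting):

```python
import sqlite3

# Delta Lake upserts are usually Spark SQL MERGE statements; this sketch
# shows the equivalent upsert pattern with stdlib sqlite3.
# (Table/column names are illustrative, not from the posting.)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, status TEXT)")
conn.executemany("INSERT INTO events VALUES (?, ?)",
                 [(1, "new"), (2, "new")])

# Incoming batch: update id=2, insert id=3 -- one statement, MERGE-style.
updates = [(2, "processed"), (3, "new")]
conn.executemany(
    "INSERT INTO events (id, status) VALUES (?, ?) "
    "ON CONFLICT(id) DO UPDATE SET status = excluded.status",
    updates,
)

rows = conn.execute("SELECT id, status FROM events ORDER BY id").fetchall()
print(rows)  # [(1, 'new'), (2, 'processed'), (3, 'new')]
```

In Spark the same logic would be a `deltaTable.merge(...)` or `MERGE INTO` statement; the assignment/update split is the same idea.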
At least 5 years of experience in testing and developing automation tests.
A minimum of 3 years of experience writing tests in Python, with a preference for experience in designing automation frameworks.
Experience in developing automation for big data testing, including data ingestion, data processing, and data migration, is highly desirable.
Familiarity with Playwright or other browser application testing frameworks is a significant advantage.
Proficiency in object-oriented programming and principles is required.
Extensive knowledge of AWS services is essential.
Strong expertise in REST API testing and SQL is required.
A solid understanding of testing and development life cycle methodologies is necessary.
Knowledge of the financial industry and trading systems is a plus
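The REST API testing skill above can be sketched with only the Python standard library; the `/health` endpoint below is a hypothetical stub spun up for the test, not a real trading-system service:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stub service under test -- a hypothetical /health endpoint.
class StubHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps({"status": "ok"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), StubHandler)  # port 0 = free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# The test itself: status-code and payload assertions, pytest-style.
url = f"http://127.0.0.1:{server.server_port}/health"
with urllib.request.urlopen(url) as resp:
    assert resp.status == 200
    payload = json.loads(resp.read())
assert payload == {"status": "ok"}
server.shutdown()
```

In a real framework the stub would be replaced by the deployed service URL and the assertions collected into pytest test functions.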
Responsibilities
• Design and build property and casualty insurance products using the Duck Creek tool
• Document the technical / solution approach
• Perform coding, unit testing, and code reviews on the Duck Creek platform
• Coach and help team members with their delivery
Technical Expertise
• Must have deep understanding of Duck Creek Claim Administration System configuration.
• Ability to understand business needs and translate them into manuscript inheritance and group structure, including design of technical components.
• Experienced in Manuscript Coding & Configuration using Author, Product Studio, Express, Server and User Admin.
• Hands-on experience in Claims Desktop, Console modules, Party Module, Task Creation, Claim Configuring, Extension and Advanced Extension points, and Auto Reserves
• Integration and configuration with claim-related third-party systems
• Experience with SQL Server, including customizing and debugging stored procedures, triggers, user-defined functions / types, cursors, and XQueries
• Hands on development experience in Microsoft .NET including C#, ASP.NET and SQL Server
• Must be experienced in DCT debugging Tools (TraceMonitor, ExampleUtil, TSV Monitor and Data Tester).
• Must have XML and XSLT programming experience
• Must have working experience in Duck Creek product versions 4.x, 5.x, 6.x, 7.x, or later
Professional Attributes
• Should have good communication and team-building skills
• Good to have insurance domain knowledge
• Ability to work in a team environment, including but not limited to coaching the team and helping to fix issues
• Full knowledge of software development life cycle using Agile methodology
Title: Salesforce Developer with Tibco Integration
Location: Pune
Experience: 4+ years
CTC: 20-23 LPA
KEY RESPONSIBILITIES:
- Develop enhancements to support the implementation of multiple strategic business units
- Analyze production support defects and develop code fixes
- Provide technical guidance to key business stakeholders
- Collaborate with Platform Manager and Technical Architect on technical and solution designs
- Work with other IT teams on integration related issues and enhancements
- Provide guidance to QA team on testing requirements
- Document technical and solution designs
- Adhere to the documented agile process and comply with corporate standards/policies
SPECIFIC KNOWLEDGE & SKILLS:
· Experience with the development of the following components:
o Apex Triggers and Batch classes
o Lightning Web Components
o Flows, Process Builder, and Workflows
o Web Services
- Knowledge and experience of Sales Cloud implementations
- Experience with the development of Tibco Scribe and the creation of data flows
- Extensive knowledge of all admin tasks
Appristine Technologies is based in Pune, and we are hiring for the following role.
1. PHP Developer – 2+ Years Exp.
Skill:-
- Strong skill level in PHP
- Intermediate skill level in HTML and CSS
- Intermediate skill level in JavaScript or jQuery
- Intermediate skill level in at least one server-side language (e.g., Java, Python)
- Intermediate skill level with relational database systems (e.g., PostgreSQL, MySQL)
- Intermediate skill level with source control software (e.g., Git, SVN)
- Good understanding of UI/UX design principles
- Strong technical and non-technical communication skills
- Good understanding of object-oriented programming
Role and Responsibilities
- Execute data mining projects, training and deploying models over a typical duration of 2 to 12 months.
- The ideal candidate should be able to innovate, analyze the customer's requirements, develop a solution within the project plan's time box, and execute and deploy the solution.
- Integrate the data mining projects as embedded data mining applications on the FogHorn platform (on Docker or Android).
Core Qualifications
Candidates must meet ALL of the following qualifications:
- Have analyzed, trained, and deployed at least three data mining models in the past. If the candidate did not directly deploy their own models, they should have worked with others who put the models into production. The models should have been validated as robust over at least an initial time period.
- Three years of industry work experience, developing data mining models which were deployed and used.
- Core programming experience in Python, using data mining libraries such as Scikit-Learn; other relevant Python libraries include NumPy, SciPy, and Pandas.
- Data mining algorithm experience in at least 3 algorithms across: prediction (statistical regression, neural nets, deep learning, decision trees, SVM, ensembles), clustering (k-means, DBSCAN or other) or Bayesian networks
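Of the clustering algorithms listed above, k-means is the simplest to sketch. The toy implementation below uses only the standard library; in practice this role would use sklearn.cluster.KMeans, and the points and initial centers here are illustrative:

```python
import math

# k-means (Lloyd's algorithm) on toy 2-D points, stdlib only.
# Real work would use sklearn.cluster.KMeans; this just shows the
# assignment/update loop the algorithm iterates.
def kmeans(points, centers, iters=10):
    for _ in range(iters):
        # Assignment step: each point joins its nearest center.
        clusters = [[] for _ in centers]
        for p in points:
            i = min(range(len(centers)),
                    key=lambda i: math.dist(p, centers[i]))
            clusters[i].append(p)
        # Update step: move each center to its cluster's mean
        # (keep the old center if a cluster went empty).
        centers = [
            tuple(sum(c) / len(c) for c in zip(*cl)) if cl else centers[i]
            for i, cl in enumerate(clusters)
        ]
    return centers

points = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
centers = kmeans(points, centers=[(0, 0), (10, 10)])
print(centers)  # two centers near (1/3, 1/3) and (31/3, 31/3)
```

The same two-step loop underlies DBSCAN's density reasoning only loosely; k-means is the canonical "assign, then re-estimate" clustering pattern.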
Bonus Qualifications
Any of the following extra qualifications will make a candidate more competitive:
- Soft Skills
- Sets expectations, develops project plans, and delivers against them.
- Experience adapting technical dialogue to the right level for the audience (e.g., executives) or to the jargon of a given vertical market and job function.
- Technical skills
- Commonly, candidates have a MS or Ph.D. in Computer Science, Math, Statistics or an engineering technical discipline. BS candidates with experience are considered.
- Have managed past models in production over their full life cycle until model replacement is needed. Have developed automated model refreshing on newer data. Have developed frameworks for model automation as a prototype for product.
- Training or experience in Deep Learning, such as TensorFlow, Keras, convolutional neural networks (CNN) or Long Short Term Memory (LSTM) neural network architectures. If you don’t have deep learning experience, we will train you on the job.
- Shrinking deep learning models, optimizing to speed up execution time of scoring or inference.
- OpenCV or other image processing tools or libraries
- Cloud computing: Google Cloud, Amazon AWS or Microsoft Azure. We have integration with Google Cloud and are working on other integrations.
- Experience with decision-tree ensembles such as XGBoost or Random Forests is helpful.
- Complex Event Processing (CEP) or other streaming data as a data source for data mining analysis
- Time series algorithms from ARIMA to LSTM to Digital Signal Processing (DSP).
- Bayesian Networks (BN), a.k.a. Bayesian Belief Networks (BBN) or Graphical Belief Networks (GBN)
- Experience with PMML is of interest (see www.DMG.org).
- Vertical experience in Industrial Internet of Things (IoT) applications:
- Energy: Oil and Gas, Wind Turbines
- Manufacturing: Motors, chemical processes, tools, automotive
- Smart Cities: Elevators, cameras on population or cars, power grid
- Transportation: Cars, truck fleets, trains
About FogHorn Systems
FogHorn is a leading developer of “edge intelligence” software for industrial and commercial IoT application solutions. FogHorn’s Lightning software platform brings the power of advanced analytics and machine learning to the on-premise edge environment enabling a new class of applications for advanced monitoring and diagnostics, machine performance optimization, proactive maintenance and operational intelligence use cases. FogHorn’s technology is ideally suited for OEMs, systems integrators and end customers in manufacturing, power and water, oil and gas, renewable energy, mining, transportation, healthcare, retail, as well as Smart Grid, Smart City, Smart Building and connected vehicle applications.
Press: https://www.foghorn.io/press-room/
Awards: https://www.foghorn.io/awards-and-recognition/
- 2019 Edge Computing Company of the Year – Compass Intelligence
- 2019 Internet of Things 50: 10 Coolest Industrial IoT Companies – CRN
- 2018 IoT Platforms Leadership Award & Edge Computing Excellence – IoT Evolution World Magazine
- 2018 10 Hot IoT Startups to Watch – Network World. (Gartner estimated 20 billion connected things in use worldwide by 2020)
- 2018 Winner in Artificial Intelligence and Machine Learning – Globe Awards
- 2018 Ten Edge Computing Vendors to Watch – ZDNet & 451 Research
- 2018 The 10 Most Innovative AI Solution Providers – Insights Success
- 2018 The AI 100 – CB Insights
- 2017 Cool Vendor in IoT Edge Computing – Gartner
- 2017 20 Most Promising AI Service Providers – CIO Review
Our Series A round raised $15 million, and our Series B round raised $30 million in October 2017. Investors include Saudi Aramco Energy Ventures, Intel Capital, GE, Dell, Bosch, Honeywell, and The Hive.
About the Data Science Solutions team
In 2018, our Data Science Solutions team grew from 4 to 9, and we are growing again from 11. We work on revenue-generating projects for clients, such as predictive maintenance, time-to-failure prediction, and manufacturing defect detection. About half of our projects have involved vision recognition or deep learning. We are not only working on consulting projects but also developing vertical solution applications, with embedded data mining, that run on our Lightning platform.
Our data scientists like our team because:
- We care about "best practices"
- We have a direct impact on the company's revenue
- We give and receive mentoring as part of the collaborative process
- Questioning and challenging the status quo with data is safe
- Intellectual curiosity is balanced with humility
- We present papers or projects in our "Thought Leadership" meeting series to support continuous learning


