About the Organization
Real estate syndicators leverage SyndicationPro to manage billions in real estate assets and thousands of investors. Growing at 17% MoM, SyndicationPro is the #1 platform for automating real estate fundraising, investor relations, and closing more deals!
What makes SyndicationPro unique is that it is cash flow positive while maintaining a healthy growth rate and is backed by seasoned investors and real estate magnates. We are also a FirstPrinciples.io Venture Studio Company, giving us the product engineering, growth, and operational backbone in India to scale our team.
Our story has been covered by Bloomberg Business, Yahoo Finance, and several other renowned news outlets.
Why this Role
We are scaling our Customer Support & Success organization, and you will have the opportunity to see how systems and processes are built for a rapidly scaling SaaS company. Our Customer Support & Success team is led by experienced SaaS operators who can provide the guidance and mentorship to help you grow quickly. You will also have the opportunity for faster-than-normal promotion cycles.
Roles and Responsibilities
- Engage with our clients primarily through email, phone, or our ticketing system
- Understand client needs and tackle problems
- Think creatively about technical issues and find ways to change showstoppers into positive ways forward
- Work closely with internal teams and external partners to deep-dive into a customer request or issue and provide resolution within SLA
- Manage workflow using our systems to ensure nothing falls to the bottom of the to-do list
- Keep current on product releases and updates
- Share knowledge with your fellow troubleshooters and create knowledge articles when necessary
- Support customers by working with migration teams on data migrations to our platform
- Work on migrating sensitive customer data from a third-party investment platform or Excel to SyndicationPro with minimal supervision.
- Understand the client’s needs and set up the SyndicationPro platform to meet customer expectations.
Requirements
- A passion for solving problems
- A way with words that makes technical topics easy to understand
- A keen eye for getting to the root cause of an issue
- A love of helping people
- An ability to manage multiple needs and keep them all on track
- A strong ethos towards helping the team win
- Prior experience in SaaS environments
- Experience with MS Office and data programs
- Attention to detail
- Confidentiality
- Experience in writing professional emails
- Experience in handling a high volume of customer queries
- Must have 1-2 years of prior experience in SaaS environments
- Must be comfortable working night shifts

About SyndicationPro
We are also part of FirstPrinciples.io, a tech holding company that starts, acquires, and advises B2B SaaS companies. FirstPrinciples.io and SyndicationPro.com have a combined strength of over 80 people across India, the U.S., Spain, and Germany.
SyndicationPro.com has been featured by Yahoo Finance (https://in.finance.yahoo.com/news/syndicationpro-establishes-hq-lehi-utah-130600371.html?guccounter=1) and Benzinga (https://www.benzinga.com/business/best-crms-for-real-estate-investors/), among other top publications! We were also finalists in industry-leading awards such as Finovate (https://informaconnect.com/finovate-industry-awards/awards-categories/) and the 2021 SaaS Awards (https://www.cloud-awards.com/2021-software-awards-shortlist/).
🚨 Hiring Alert 🚨
We are hiring a Java Backend Intern for 2 months!
Skills Required:
1. Good understanding of Java 17, Spring, and any SQL database
2. Good understanding of designing low-level code from scratch
3. Experience in building database schemas and code architecture
4. Familiarity with design patterns and willingness to write clean, readable, and well-documented code
5. Familiarity with tools like Postman, STS, or IntelliJ
6. Understanding of REST APIs and their role in application development
7. Good DSA and problem-solving skills
Roles & Responsibilities:
1. Assist in developing and maintaining web applications.
2. Learn to utilize open-source tools for integration.
3. Collaborate with team members to design and implement new features.
4. Contribute to optimizing application performance and resolving bugs.
5. Stay curious and keep learning new technologies relevant to Spring Boot and Spring Reactive.
6. Gain exposure to version control systems like Git.
7. Bring a passion for learning and contributing to real-world projects.
Preferred Qualifications:
1. 0-2 years of experience.
2. Background in computer science/IT or a related field.
What You’ll Gain:
1. Stipend of 8k-10k/month, subject to your performance
2. Hands-on experience in building production-grade applications.
Marketing Specialist
Company Summary :
As the recognized global standard for project-based businesses, Deltek delivers software and information solutions to help organizations achieve their purpose. Our market leadership stems from the work of our diverse employees who are united by a passion for learning, growing and making a difference. At Deltek, we take immense pride in creating a balanced, values-driven environment, where every employee feels included and empowered to do their best work. Our employees put our core values into action daily, creating a one-of-a-kind culture that has been recognized globally. Thanks to our incredible team, Deltek has been named one of America's Best Midsize Employers by Forbes, a Best Place to Work by Glassdoor, a Top Workplace by The Washington Post and a Best Place to Work in Asia by World HRD Congress. www.deltek.com
Business Summary :
The Deltek Corporate Development team plays a vital role in spearheading initiatives that drive the company's go-to-market strategies. We help foster business growth, expansion and maturity by creating and delivering value to internal stakeholders and our growing partner and customer ecosystems. As part of our dynamic team, you'll collaborate with product, marketing, sales and partner alliance organizations to challenge conventions and deliver exceptional, high-quality product solutions.
External Job Title :
Competitive Intelligence Analyst
Position Responsibilities :
Deltek is seeking a detail-oriented and curious Competitive Intelligence Analyst to join our team. In this role, you will be responsible for gathering, analyzing, and disseminating actionable intelligence on competitors and market trends impacting GovWin IQ, the premier market intelligence platform for state, local, federal and Canadian govcons. Your insights will inform strategic decision-making and help position GovWin IQ ahead in the marketplace.
- Conduct comprehensive research on GovWin IQ competitors, including their products, services, pricing, and market strategies.
- Monitor industry trends, market developments, and regulatory changes that may impact GovWin IQ’s competitive positioning.
- Analyze data from various sources to identify opportunities and threats in GovWin IQ’s market landscape.
- Develop and maintain a repository of competitive intelligence information and ensure its accessibility to relevant stakeholders.
- Collaborate with cross-functional teams, including Product Management, Marketing, Customer Success and Sales, to provide insights that support strategic initiatives.
- Prepare and present detailed reports and presentations to leadership and cross-functional teams, highlighting key findings and recommendations.
Qualifications :
- Bachelor's degree in Business, Marketing, Economics, Government/Public Affairs, or a related field
- Minimum of 3 years of experience in competitive intelligence, market research, or a related analytical role.
- Strong analytical skills with the ability to interpret complex data and translate it into actionable insights.
- Experience in various structured analytic techniques, like SWOT, ACH, MDCM, etc.
- Excellent written and verbal communication skills, with the ability to present information clearly and concisely.
- Proficiency in using competitive intelligence tools and platforms; experience using competing MI solutions (GovTribe, BGov, Govly, GovSpend, DACIS, others) a plus.
- Demonstrated ability to work independently and collaboratively in a fast-paced environment.
- High level of integrity and ethical standards in handling sensitive information.

The Sr. AWS/Azure/GCP Databricks Data Engineer at Koantek will use comprehensive modern data engineering techniques and methods with Advanced Analytics to support business decisions for our clients. Your goal is to support the use of data-driven insights to help our clients achieve business outcomes and objectives. You can collect, aggregate, and analyze structured/unstructured data from multiple internal and external sources and communicate patterns, insights, and trends to decision-makers. You will help design and build data pipelines, data streams, reporting tools, information dashboards, data service APIs, data generators, and other end-user information portals and insight tools. You will be a critical part of the data supply chain, ensuring that stakeholders can access and manipulate data for routine and ad hoc analysis to drive business outcomes using Advanced Analytics. You are expected to function as a productive member of a team, working and communicating proactively with engineering peers, technical leads, project managers, product owners, and resource managers.
Requirements:
- Strong experience as an AWS/Azure/GCP Data Engineer; must have AWS/Azure/GCP Databricks experience
- Expert proficiency in Spark Scala, Python, and Spark
- Must have data migration experience from on-prem to cloud
- Hands-on experience with Kinesis to process and analyze stream data, Event/IoT Hubs, and Cosmos DB
- In-depth understanding of Azure/AWS/GCP cloud, data lake, and analytics solutions
- Expert-level hands-on experience designing and developing applications on Databricks
- Extensive hands-on experience implementing data migration and data processing using AWS/Azure/GCP services
- In-depth understanding of Spark architecture, including Spark Streaming, Spark Core, Spark SQL, DataFrames, RDD caching, and Spark MLlib
- Hands-on experience with the industry technology stack for data management, data ingestion, capture, processing, and curation: Kafka, StreamSets, Attunity, GoldenGate, MapReduce, Hadoop, Hive, HBase, Cassandra, Spark, Flume, Impala, etc.
- Hands-on knowledge of data frameworks, data lakes, and open-source projects such as Apache Spark, MLflow, and Delta Lake
- Good working knowledge of code versioning tools such as Git, Bitbucket, or SVN
- Hands-on experience using Spark SQL with various data sources such as JSON, Parquet, and key-value pairs
- Experience preparing data for data science and machine learning, with exposure to model selection, model lifecycle, hyperparameter tuning, model serving, deep learning, etc.
- Demonstrated experience preparing data and automating and building data pipelines for AI use cases (text, voice, image, IoT data, etc.)
- Good to have programming experience with .NET or Spark/Scala
- Experience creating tables, partitioning, bucketing, loading, and aggregating data using Spark Scala and Spark SQL/PySpark
- Knowledge of AWS/Azure/GCP DevOps processes such as CI/CD, as well as Agile tools and processes including Git, Jenkins, Jira, and Confluence
- Working experience with Visual Studio, PowerShell scripting, and ARM templates; able to build ingestion to ADLS and enable a BI layer for analytics
- Strong understanding of data modeling and defining conceptual, logical, and physical data models
- Big data/analytics/information analysis/database management in the cloud
- IoT/event-driven/microservices in the cloud; experience with private and public cloud architectures, their pros/cons, and migration considerations
- Ability to stay up to date with industry standards and technological advancements that will enhance data quality and reliability to advance strategic initiatives
- Working knowledge of RESTful APIs, the OAuth2 authorization framework, and security best practices for API gateways
- Guide customers in transforming big data projects, including development and deployment of big data and AI applications
- Guide customers on data engineering best practices; provide proofs of concept, architect solutions, and collaborate when needed
- 2+ years of hands-on experience designing and implementing multi-tenant solutions using AWS/Azure/GCP Databricks for data governance, data pipelines for near-real-time data warehouses, and machine learning solutions
- 5+ years of overall experience in software development, data engineering, or data analytics using Python, PySpark, Scala, Spark, Java, or equivalent technologies
- Hands-on expertise in Apache Spark (Scala or Python)
- 3+ years of experience in query tuning, performance tuning, troubleshooting, and debugging Spark and other big data solutions
- Bachelor's or Master's degree in Big Data, Computer Science, Engineering, Mathematics, or a similar area of study, or equivalent work experience
- Ability to manage competing priorities in a fast-paced environment
- Ability to resolve issues
- Basic experience with or knowledge of Agile methodologies
- AWS Certified Solutions Architect - Professional
- Databricks Certified Associate Developer for Apache Spark
- Microsoft Certified: Azure Data Engineer Associate
- Google Cloud Certified - Professional
Key Areas of Responsibility
- Participate in project discovery/analysis, design, and deployment planning.
- Document business requirements and functional specifications using Microsoft's Sure Step methodology (GAP/Fit analysis and Functional Requirements Documentation).
- Analyze client business processes and recommend ways to improve or re-engineer for optimum performance.
- Implement and deploy Dynamics 365 Business Central software (design, configure, train, support).
- Understand user and system requirements and identify specific enhancement customizations if necessary.
- Provide data migration and conversion services.
- Deliver application training online and in one-on-one environments.
- Provide basic technical support for project life cycle as well as complex support issues.
- Manage client expectations, the project team (internal and external) all while providing superior customer service.
- Work with application developers during design, development, and testing phases.
- Draft end user documentation.
- Assist in presales demos and related activities
Required Candidate Profile:
- Should have worked in multiple versions of Dynamics NAV and D365 Business Central for international clients.
- Should have completed at least 6-7 full life cycle implementations (Full Life Cycle = General Ledger, Accounts Receivable, Accounts Payable, Fixed Assets, Budget, Cash Flow, SCM, Inventory, Discrete Manufacturing).
- Background or strong career interest in finance or business analysis.
- Possess a strong, professional work ethic.
- Excels at team collaboration.
- Demonstrated problem-solving and decision-making skills.
- Strives to provide exceptional customer service and high client satisfaction.
- Strong verbal, written and organizational skills.
- Must be fluent in English – both verbal and written

WHAT YOU WILL DO:
● Create and maintain optimal data pipeline architecture.
● Assemble large, complex data sets that meet functional/non-functional business requirements.
● Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
● Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using Spark, Hadoop, and AWS 'big data' technologies (EC2, EMR, S3, Athena).
● Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
● Work with stakeholders including the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs.
● Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
● Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
● Work with data and analytics experts to strive for greater functionality in our data systems.
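The extraction-transformation-loading responsibilities above follow one common shape. As a toy, stdlib-only sketch of that ETL pattern (in production the same shape would be built with Spark and AWS services such as EMR and S3; the field names and the sqlite sink here are hypothetical stand-ins):

```python
import json
import sqlite3

# Extract: raw events, standing in for records pulled from S3 or an upstream API.
raw = '[{"user": "u1", "amount": "19.99"}, {"user": "u2", "amount": "5.00"}, {"user": "u1", "amount": "3.50"}]'
events = json.loads(raw)

# Transform: cast string amounts to floats and aggregate revenue per user.
totals = {}
for e in events:
    totals[e["user"]] = totals.get(e["user"], 0.0) + float(e["amount"])

# Load: persist into a queryable store (sqlite as a stand-in for a warehouse).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE revenue (user TEXT PRIMARY KEY, total REAL)")
conn.executemany("INSERT INTO revenue VALUES (?, ?)", totals.items())
conn.commit()

for user, total in conn.execute("SELECT user, total FROM revenue ORDER BY user"):
    print(user, total)
```

At scale the transform step becomes a distributed job and the load step a partitioned write, but the extract/transform/load boundaries stay the same.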
REQUIRED SKILLS & QUALIFICATIONS:
● 5+ years of experience in a Data Engineer role.
● Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases.
● Experience building and optimizing 'big data' data pipelines, architectures, and data sets.
● Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
● Strong analytic skills related to working with unstructured datasets.
● Experience building processes supporting data transformation, data structures, metadata, dependency, and workload management.
● A successful history of manipulating, processing, and extracting value from large disconnected datasets.
● Working knowledge of message queuing, stream processing, and highly scalable 'big data' data stores.
● Strong project management and organizational skills.
● Experience supporting and working with cross-functional teams in a dynamic environment.
● Experience with big data tools: Hadoop, Spark, Pig, Vertica, etc.
● Experience with AWS cloud services: EC2, EMR, S3, Athena.
● Experience with Linux.
● Experience with object-oriented/object function scripting languages: Python, Java, Shell, Scala, etc.
PREFERRED SKILLS & QUALIFICATIONS:
● Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field.
- Work independently without any supervision
- Work on continuous improvement of the products through innovation and learning; a knack for benchmarking and optimization is essential
- Experience in deploying highly complex, distributed transaction processing systems
- Stay abreast of new innovations and the latest technology trends, and explore ways of leveraging these to improve the product in alignment with the business
- As a component owner whose component impacts multiple platforms (5-10-member team), work with customers to obtain their requirements and deliver the end-to-end project
Required Experience, Skills, and Qualifications
- 5+ years of experience as a DevOps Engineer; experience with Golang is a plus
- At least one End to End CI/CD Implementation experience
- Excellent problem-solving and debugging skills in the DevOps area
- Good understanding of containerization (Docker/Kubernetes)
- Hands-on build/package tool experience
- Experience with AWS services: Glue, Athena, Lambda, EC2, RDS, EKS/ECS, ALB, VPC, SSM, Route 53
- Experience with setting up CI/CD pipeline for Glue jobs, Athena, Lambda functions
- Experience architecting interaction with services and application deployments on AWS
- Experience with Groovy and writing Jenkinsfile
- Experience with repository management, code scanning/linting, secure scanning tools
- Experience with deployments and application configuration on Kubernetes
- Experience with microservice orchestration tools (e.g. Kubernetes, Openshift, HashiCorp Nomad)
- Experience with time-series and document databases (e.g. Elasticsearch, InfluxDB, Prometheus)
- Experience with message buses (e.g. Apache Kafka, NATS)
- Experience with key-value stores and service discovery mechanisms (e.g. Redis, HashiCorp Consul, etc)
- Quickly understanding the context of a student's preparation, assessing their learning patterns, pitching relevant products, and making a sale
- Building an empathetic connection with the student community at large and understanding their motivation levers
- Making cold/filtered lead calls, establishing product relevance, and ultimately closing the sale
- Maintaining knowledge of all product and service offerings of the company
- Understanding the core differentiators among competitors in the same industry, and informing the management and users about them to increase sales volume


● Working hand in hand with application developers and data scientists to help build software that scales in terms of performance and stability
Skills
● 3+ years of experience managing large-scale data infrastructure and building data pipelines/data products
● Proficiency in any data engineering technologies; proficiency in AWS data engineering technologies is a plus
● Language: Python, Scala, or Go
● Experience working with real-time streaming systems
● Experience handling millions of events per day
● Experience developing and deploying data models on the cloud
● Bachelor's/Master's in Computer Science or equivalent experience
● Ability to learn and use skills in new technologies


