
Software Engineer

  • Company: PostProcess
  • Posted: 02/09/2022
  • Location: Buffalo, NY
  • Category: Software Engineering
  • Job Type: Direct Placement

Description


Why Join the PostProcess Team

If you’re looking to join the hottest tech company in the fast-growing 3D printing / Additive Manufacturing (AM) market, this is the role for you. PostProcess is the only intelligent and automated post-printing solution for 3D printed parts and has raised more than $38M in funding to accelerate its already impressive growth. The ideal candidate for this Software Engineering role will have a strong background in heterogeneous data management environments and a passion for building software that collects, aggregates, and analyzes data at scale. You will be expected to have extensive, hands-on experience with data warehousing best practices and techniques in medium to large enterprise data warehousing environments. You will work with our software engineering, product management, and development engineering teams to shape our data strategy at PostProcess.

The successful candidate will have the ability to think strategically about our business and our customers’ success while working in collaboration across all departments and all teams to deliver exceptional customer satisfaction to our growing list of global customers. Customer focus and hands-on execution are hallmarks of a successful PostProcess teammate.

Duties and Responsibilities

The essential functions of the position include, but are not limited to, the following:

  • Anticipate and plan for the growth of the PostProcess data platform infrastructure
  • Scope, design and implement platform solutions that make the appropriate tradeoffs between resiliency, durability, and performance
  • Help debug and solve critical infrastructure issues across our services and multiple levels of the stack
  • Participate in all phases of the Software Development Life Cycle (SDLC)
  • Adhere to high-quality development principles while delivering projects with technical excellence
  • Identify bottlenecks and bugs, and devise solutions to mitigate and address these issues
  • Participate in peer reviews of solution designs and related code
  • Package and support deployment of releases
  • Maintain high standards of software quality within the team by establishing good practices and habits
  • Develop documentation throughout the SDLC
  • Translate requirements into high-quality, testable, scalable software
  • Design, develop, and unit test applications in accordance with established standards
  • Assess opportunities for application and process improvement
  • Develop high-quality state-of-the-art algorithms
  • Apply an Agile/Scrum mindset in the software domain

Skills and Qualifications

  • In-depth understanding of data management (e.g., permissions, recovery, security and monitoring)
  • Strong understanding and practical experience in data platform fundamentals, including clustering, distributed systems, fault tolerance, networking, etc.
  • Strong understanding and practical experience with systems like Hadoop, Spark, Presto, Iceberg, and Airflow
  • Experience planning and driving large projects involving multiple stakeholders across an organization
  • Involvement with high scalability projects involving cloud-based infrastructure design and implementation
  • Experience with Microsoft development tools and services (Visual Studio and Azure DevOps highly desired)
  • Experience with document-oriented databases and non-relational databases
  • Strong relational database experience and SQL/stored procedure skills
  • Strong written and verbal communication
  • Demonstrated ability to work cross-functionally to meet program requirements
  • Excellent analytical and problem-solving skills with the ability to think quickly and offer alternatives
  • Organized, goal-oriented, motivated self-starter who can work well in a team environment
  • Working knowledge of the agile software development life cycle
  • Strong statistical and data analytics skills, along with knowledge of basic data structures and algorithms, are highly desired but not required

Education and Experience

  • B.S. or M.S. in Computer Science or equivalent
  • 3-5+ years of experience working with data warehouses or databases
  • 3+ years of strong hands-on experience with cloud tools: Azure Databricks, Azure Delta Lake, Azure Data Factory, SQL Server, Azure Synapse, Azure DevOps, Azure Data Lake Storage (ADLS), PySpark, Databricks Notebooks
  • 3+ years of experience with big data technologies such as Presto/Trino, Spark, Hadoop, Airflow, Kafka, Apache Flink, Druid, and Apache Pinot required
  • Familiarity with Net, TensorFlow, or other similar deep net or machine learning packages
  • Experience with IoT communication protocols, web security, and software product lifecycle
  • Experience in an early stage, high growth company is beneficial
  • Knowledge of Traditional or Additive Manufacturing is helpful

Desired Attributes

  • Strong Work Ethic
  • Optimistic Problem Solver – open to new ideas and ways of thinking
  • Team Player – willing to roll up their sleeves and be a utility infielder as required
  • Accountable & Dependable – consistently “Does What They Say They’re Going To Do.”
  • Resourceful – consistently gets the important things done with limited direction
  • Self-motivated
  • Strong Attention to Detail
  • Trustworthy & Honest
  • High Sense of Urgency
  • Well Organized
  • Quick Learner
