Stephen Sequenzia

A Machine Learning Engineer, Data Engineer, Python Software Engineer and Data Scientist who advances the data, analytics and statistical maturity of an organization through a portfolio of innovative frameworks. Possesses an ideal foundation for data science: an understanding of the business perspective, statistics for analytical thinking, knowledge of the data lifecycle and the development experience to turn ideas into tangible solutions. Takes a leadership role in working with senior management to establish a data science architecture that supports robust analytic capabilities.


What I Do

Data Science

A Data Scientist is a data expert who extracts insights from large data sets to help organizations solve complex problems. To do so, Data Scientists combine computer science, mathematics, statistics, and modeling with a strong understanding of their business and industry to unlock new opportunities and strategies.

Machine Learning

A Machine Learning Engineer focuses on researching, building and designing self-running Artificial Intelligence systems to automate predictive models. Machine Learning Engineers design and create the AI algorithms capable of learning and making predictions that define machine learning.

Data Engineering

A Data Engineer focuses on preparing data for analytical or operational uses. These engineers are typically responsible for building data pipelines to bring together information from different source systems. They integrate, consolidate and cleanse data and structure it for use in analytics applications. They aim to make data easily accessible and to optimize their organization's big data ecosystem.

Python Development

A Python Developer is responsible for coding, designing, deploying and debugging development projects, typically on the server side. They often work in close collaboration with data collection and analytics teams to turn questions into useful answers and valuable insight.


Feb 2016 - Present
Applied Theta Inc

Owner | Data Scientist | Machine Learning Engineer | Python Developer

Create analytic opportunities with a portfolio of frameworks and models designed to accelerate development and ensure the integrity of applied Artificial Intelligence/Machine Learning models that meet the domain-specific needs of financial markets. Although initially developed for internal use, many of the models and frameworks are now open source.
  • Developed the entire technology strategy, roadmap and solutions using agile software development.
  • Developed several Python-based frameworks and libraries.
  • Utilized Machine Learning algorithms to produce financial models and algorithmic trading systems.
  • Built advanced pipelines to handle data from various sources and time resolutions.
  • Standardized codebases to utilize leading data science frameworks like TensorFlow, Keras, NumPy and Pandas.
  • Provided Machine Learning based consulting, design and development to other companies.
Mar 2010 - Aug 2020
Zia Technology Group Inc

Owner | Data Engineer | Python Developer | IT Consultant

Collaborated with senior and executive leadership on defining and implementing technology strategies and roadmaps to resolve a variety of data and information management challenges. Worked as a consultant, architect and lead developer on solution design and development, using a range of technologies and agile software development practices.
  • Designed and developed complex data integrations across multiple business applications.
  • Deployed many SQL/NoSQL databases and designed custom data schemas.
  • Performed full-stack development of several custom web-based applications and reporting systems.
  • Designed and implemented fully redundant and highly available virtual infrastructures both fully cloud-based and hybrid co-location/cloud.
Jan 2007 - Mar 2010
Thinksys Incorporated

Owner | Developer | DBA | IT Consultant

Worked directly with senior leadership—company owners and CFOs—to understand the business needs and deliver the right technology solutions and robust IT data infrastructures. Acted as a consultant, DBA and lead developer on several business-critical projects.
  • Set up, deployed and managed many MS SQL Servers and databases for both custom and third-party applications.
  • Designed and developed complex data integrations across multiple business applications.
  • Developed and deployed web-based interfaces for access to back-end data sources.
  • Managed a small team of network engineers and developers.
Dec 2001 - Jan 2007
Biziteks Incorporated

Owner | Network Engineer | Developer | DBA

Provided full-stack IT consulting, support and development for small to medium-sized companies in various industries.
  • Managed team of 15 technicians and network engineers.
  • Set up, deployed and managed many MS SQL Servers and databases for both custom and third-party applications.
  • Set up, managed and supported hundreds of client networks and applications.
  • Developed web-based applications and dynamic websites.


Photon Machine Learning Framework

A Machine Learning framework that extends the functionality of other frameworks such as TensorFlow and Keras. Photon ML is built to apply neural network and ensemble modeling techniques for deep learning financial algorithms. The framework supports the entire lifecycle of a Machine Learning project including data preparation, model development, training, monitoring, evaluation and deployment.


Project Repo:


Technologies & Methods: Python, Neural Networks, Deep Learning, Distributed Training

Frameworks & Libraries: TensorFlow, Keras, NumPy, Pandas, Apache Arrow, Apache Parquet, Scikit-learn


Key Features of Photon ML:

  • Custom object-oriented API with built-in subclassing of Keras and TensorFlow APIs.
  • Built-in custom modules such as models, layers, optimizers and loss functions.
  • Highly customizable interface to extend built-in modules for specific algorithms/networks.
  • Detailed logging and analysis of model parameters to increase interpretability and optimization.
  • Works natively with TensorFlow distributed strategies.
  • Real-time data preprocessing: dataset splitting, normalization, scaling, aggregation and time series resampling.
  • Custom batching, padding and masking of data.
  • Designed to be model/algorithm agnostic and to work natively with container services.
  • Natively shares input and output between multiple networks to streamline deep ensemble learning.
  • Simple interface for saving, serializing and loading entire networks, including learned parameters and hyperparameters.
  • Custom dynamic learning rate scheduling.
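As an illustration of the custom dynamic learning rate scheduling listed above, a schedule of this kind could be sketched in plain Python as follows. The function name and the warmup/cosine shape are illustrative assumptions, not the actual Photon ML implementation, which subclasses the Keras/TensorFlow scheduling APIs:

```python
import math

def cosine_warmup_schedule(step, base_lr=1e-3, warmup_steps=100, total_steps=1000):
    """Hypothetical dynamic learning rate: linear warmup, then cosine decay."""
    if step < warmup_steps:
        # Ramp the learning rate up linearly during warmup.
        return base_lr * (step + 1) / warmup_steps
    # Decay from base_lr to 0 along a half cosine over the remaining steps.
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * min(progress, 1.0)))
```

In a Keras-based framework the same logic would typically live in a `LearningRateSchedule` subclass so the optimizer can call it each training step.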

Modeling Research & Development

An evolving collection of data models and algorithms used to model financial markets including equities, options, futures and crypto assets. The project has grown from only traditional statistical modeling to Machine Learning based modeling which includes deep neural networks and probabilistic reasoning networks.


Project Repo:


Technologies & Methods: Deep Learning, Neural Networks, Regression, Classification, Probabilistic Reasoning, Deep Ensemble Learning, Back-Propagation, Statistical Analysis, Financial Modeling


Frameworks & Libraries: TensorFlow, Keras, Pandas, NumPy, Seaborn, Matplotlib, Statsmodels, Scikit-Learn


Algorithms & Models: CNNs, RNNs, Transformers, Attention Mechanisms, Temporal Conv Networks, Autoencoders, Bayesian Neural Networks, ARIMA, Structural Time Series

  • Produced predictions and quantitative analyses of asset prices, with a focus on predictive inference on time series data.
  • Utilized advanced generalization and regularization techniques to improve model performance beyond the training data set.
  • Developed custom optimizers and loss functions to address the unique complexities of time series market data.
  • Conducted extensive exploratory data analysis (EDA), feature engineering, data scaling/normalization, resampling, and principal component analysis (PCA).
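A custom loss function of the kind described above might, for example, penalize predictions that get the direction of a price move wrong more heavily than those that merely miss its magnitude. The sketch below is purely illustrative; the function name, weighting scheme and NumPy implementation are assumptions, not the project's actual loss:

```python
import numpy as np

def directional_mse(y_true, y_pred, penalty=2.0):
    """Illustrative asymmetric loss: squared error, where errors that get
    the sign (direction) of the move wrong are weighted more heavily."""
    err = y_pred - y_true
    wrong_direction = np.sign(y_pred) != np.sign(y_true)
    weights = np.where(wrong_direction, penalty, 1.0)
    return float(np.mean(weights * err ** 2))
```

In practice such a loss would be implemented with TensorFlow ops so it remains differentiable for back-propagation.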

Machine Learning Preprocessing & Pipelines

Developed a set of custom libraries to handle the unique characteristics of acquiring, preparing and storing financial market data. These libraries include WebSocket/RESTful API data connectors that access data, detect anomalies across hundreds of millions of data points, then cleanse and preprocess the data to support high-integrity modeling. Also included is a set of tools for domain-specific feature engineering and labeling of financial market data.


Project Repo:


Technologies & Methods: Data Preprocessing and Cleaning, Feature Engineering, Machine Learning Pipelines, Data Labeling, WebSockets/RESTful APIs, Nvidia CUDA GPUs, Python Data Structures and Storage


Frameworks: Pandas, NumPy, Numba, RAPIDS, Dask, Apache Arrow, Apache Parquet, Scikit-learn, Seaborn, Matplotlib

  • Utilized domain knowledge to engineer features based on technical indicators and statistical markers.
  • Processed over 20 years of trade and order-book market data across multiple time resolutions.
  • Increased the speed and efficiency of data storage and retrieval, utilizing Apache Arrow for in-memory columnar storage and Apache Parquet for persistent on-disk storage.
  • Produced a rich set of tools for managing data flow for modeling and algorithmic trading, orchestrating Machine Learning pipelines, streamlining feature extraction and improving data integrity.
  • Designed custom procedures to label time series data with algorithmic trading attributes such as stop losses, long/short price targets, ATR and VWAP values.
  • Processed and stored data from high-volume real-time market data APIs.
  • Automated data labeling, binning, one-hot encoding, scaling and normalization of time series market data.
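The time series resampling step described above can be sketched with pandas, for example aggregating per-second tick prices into one-minute OHLC bars. The data here is synthetic and the code is a minimal illustration, not the project's pipeline code:

```python
import numpy as np
import pandas as pd

# Hypothetical tick data: one trade price per second for two minutes.
idx = pd.date_range("2024-01-02 09:30:00", periods=120, freq="s")
ticks = pd.Series(np.linspace(100.0, 101.19, 120), index=idx)

# Resample the ticks into one-minute OHLC (open/high/low/close) bars.
bars = ticks.resample("1min").ohlc()
```

A real pipeline would layer anomaly detection, gap handling and feature engineering on top of bars like these before they reach a model.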

Algorithmic Trading System

Developed a set of tools and libraries designed to interface with market brokers and execute algorithmic trades. The algorithmic trading system executes both buy and sell orders in equities, options, futures and crypto markets. The tools also support backtesting of trading algorithms, applying predictive models to historical data to evaluate their accuracy.


Technologies & Methods: Python, C++, Algorithmic Trading, Financial Models


Frameworks: NumPy, SciPy, Pandas, Numba, Seaborn, Matplotlib, Statsmodels

  • Developed utilizing object-oriented programming in Python and C++ for increased performance.
  • Designed a set of libraries that work with both rules-based and Machine Learning based models and algorithms.
  • These same libraries can be used for both live execution of algorithmic trading systems and backtesting.
  • Produced a highly functional and efficient algorithmic trading and backtesting system, able to trade various assets across markets and provide increased visibility into key trading metrics.
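The backtesting idea above can be sketched in a few lines of NumPy: apply a signal series to historical prices and compound the resulting returns. The function name and simplifications (no transaction costs, slippage or position sizing) are illustrative assumptions, not the actual system:

```python
import numpy as np

def backtest(prices, signals):
    """Minimal vectorized backtest sketch: hold the position indicated by
    `signals` (1 = long, -1 = short, 0 = flat) over each bar's return."""
    prices = np.asarray(prices, dtype=float)
    signals = np.asarray(signals, dtype=float)
    bar_returns = np.diff(prices) / prices[:-1]
    # A signal generated on bar i is applied to the return from bar i to i+1.
    strategy_returns = signals[:-1] * bar_returns
    # Compound the per-bar returns into an equity curve.
    return np.cumprod(1.0 + strategy_returns)
```

For example, `backtest([100, 110, 99], [1, -1, 0])` is long into a 10% rise and short into a 10% fall, compounding to roughly 1.21x starting equity.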


Orlando, Florida


How Can I Help You?