Dan "Animal" Javorsek, PhD
Harriman, Tennessee, United States
6K followers
500 connections
About
I am fortunate to continue leading in our defense research enterprise as the Chief…
Activity
-
I’m happy to share that I’m starting a new position as Vice President of Engineering and Technology Development at Tactical Air Support!
Liked by Dan "Animal" Javorsek, PhD
-
Excited to see the WAM-V in action at the Maritime RobotX Challenge currently happening in Sarasota, FL! The international competition, held by…
Liked by Dan "Animal" Javorsek, PhD
-
Amazed to see how far autonomous systems have come in just a few years, not just VISTA, but across the entire DoD. Love watching lots of great…
Liked by Dan "Animal" Javorsek, PhD
Experience
Education
-
Purdue University
-
Thesis Title "Exploring Physics Beyond the Standard Model: Astrophysical Motivations and Accelerator Applications"
-
-
Thesis Projects: Automated Security Classification Lifecycle Management Blueprint and Neutrino-Induced Radionuclide Detection System (NIRDS)
-
-
Thesis Title "Characterization of Conventional High Explosive Radio Frequency Signatures"
-
-
Test Management Project Title "Transmission of Radiation for Optimum Noise"
-
-
Publications
-
Cockpit Radiation and the Human Weapon System
66th Annual Symposium of the Society of Experimental Test Pilots (SETP)
The cockpit electromagnetic environment is not good for humans. Evidence is mounting that it contributes to a very broad range of negative physiological effects: from relatively innocuous outcomes such as offspring gender asymmetries favoring daughters, to increased rates of aircrew cancer, spatial disorientation, and other forms of cognitive impairment leading to our current mishap plateau. After a brief review of radiation sources and their interaction mechanisms with human physiology, I summarize the efforts currently underway to help better characterize cockpit hostility to our most precious resource, the human weapon system. Our journey will take us from tautomerization of DNA and human magnetoreception, to the Havana Syndrome and the rising impact of Radio Frequency (RF) emissions on cognition. Finally, I will distill this diverse and fascinating research environment into meaningful advice for aviators, complete with recommendations that will significantly improve their health and welfare.
-
Dynamic Explanation of Bayesian Networks with Abductive Bayes Factor Qualitative Propagation and Entropy-Based Qualitative Explanation
IEEE 24th International Conference on Information Fusion (FUSION), pp. 1-9
The success of Artificial Intelligence (AI) systems as decision aids is often largely contingent on the ability to trust their recommendations. This trust is greatly enhanced when the AI systems are able to provide explanations to justify the presented results, which, when implemented properly, also serve as a means to better understand unfamiliar domains. Unfortunately, underlying models in such systems can often be non-intuitive for humans and thus hard to interpret and explore. Explainable AI systems allow users to effectively understand, trust, and operate the models. In this context, a recognizable model for qualitatively displaying probabilistic information is the Bayesian Network (BN), which provides a graphical visualization of quantitative beliefs about the conditional dependence and independence among random variables. We focus primarily on the dynamic explanation, which explains the reasoning process of a BN by exploring the means for analyzing the changes in the model in the light of new evidence. We extend Druzdzel’s concept about qualitative belief propagation, by introducing the idea of qualitative strength of edges in the active path, which is proportional to the Bayes factors of the Most Probable Explanation (MPE), or to the pairwise information entropy of variables (a.k.a. mutual information). We also present a full implementation in Java.
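The entropy-based edge strength mentioned above can be illustrated with a toy computation. The following is a minimal sketch of pairwise mutual information between two discrete variables; the function name and the example distributions are mine for illustration, not taken from the paper:

```python
import math

def mutual_information(joint):
    """Mutual information I(X;Y) in bits from a joint distribution.

    joint[x][y] = P(X=x, Y=y). In the paper's entropy-based scheme, a
    stronger edge in the active path corresponds to higher mutual
    information between the variables it connects.
    """
    px = [sum(row) for row in joint]                  # marginal P(X)
    py = [sum(col) for col in zip(*joint)]            # marginal P(Y)
    mi = 0.0
    for x in range(len(joint)):
        for y in range(len(joint[x])):
            p = joint[x][y]
            if p > 0:
                mi += p * math.log2(p / (px[x] * py[y]))
    return mi

# Strongly coupled variables yield high MI; independent variables yield 0.
strong = [[0.45, 0.05], [0.05, 0.45]]
weak   = [[0.25, 0.25], [0.25, 0.25]]
print(mutual_information(strong))  # ≈ 0.53 bits
print(mutual_information(weak))    # 0.0
```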
Other authors
-
Developing a Scalable Data Analytics Pipeline
Interservice/Industry Training, Simulation, and Education Conference, Paper No. 21151, 1-13 (2021)
According to the DoD, data is becoming a critical strategic asset for future conflicts. The DoD expects data to drive key decisions within areas such as supply chain and battlefield weapons. However, before data can be analyzed and mined for key insights it first needs to be usable. Collecting and managing data is a continuous challenge for both the DoD and many other organizations. Disparate collection systems and storage make generating actionable insights from the data cumbersome. In some cases, data management and configuration can make up the bulk of data analysis projects. As organizations seek to become data driven, they not only need to think about analysis strategies, but also data management to ensure data collected is usable. The work presented describes the development of a scalable data analytics pipeline used to analyze aircraft performance within a larger defensive counter air scenario. Today, air combat exercises and simulations generate large amounts of data with different structures without any method for combining and analyzing the data to improve chances of mission success. The architecture was designed to work optimally in an environment that ingests data in many different formats. As designed, this novel data pipeline allows for scalability, processing optimization/parallelization, and most importantly provides insulation from data format changes. The method developed pairs an unstructured data lake with a structured data warehouse to ensure a variety of data sources can be used to discover insights for improved warfighter decision making. This architecture allows for raw data formats from many different sources to be ingested, parsed, and stored into a common format such that the analytics techniques are re-usable and scalable. The paper describes the development and scaling process, providing a tangible example of how to manage data from a complex simulation for ingestion into a data analysis process.
Other authors
-
Using Machine Learning for Battle Management Analysis
Interservice/Industry Training, Simulation, and Education Conference, Paper No. 21150, 1-14 (2021)
As the U.S. military begins to explore using autonomous and artificially intelligent agents, serious consideration must be given to how these agents will interact with humans. Adding AI teammates will require understanding current warfighter behavior and how AI agents can augment their capabilities. The goal is to allow AI agents to perform low level and dangerous tasks so humans can focus on high-level battle management. However, in order to conduct this battle management role successfully, operators will require high quality data driven insights that help them make sense of an increasingly complex battlespace. This work begins to look at quantifying and measuring behavior in an air combat simulation using machine learning. The goal is to build an understanding of key performance metrics that help drive mission success or failure. From these insights, machine learning agents can be created and tuned to properly weight or select the correct behaviors to maximize the warfighter's chances of winning. The initial data analysis strategy is based around answering two simple questions: 1) did blue win? 2) if not, why? The analysis specifically looks at determining the importance of different metrics to the outcome of the scenario. Run level metrics like loss exchange ratio are fed into a Random Forest classifier. This classifier makes a win or loss prediction based on scenario metrics. Then, based on the importance of each feature for making the win/loss decision, the relative importance of each metric can be gauged. Initial analysis of the data suggests only a handful of traditional performance metrics play a role in determining win/loss for the scenario. The final paper will describe model development and the analysis results. Ultimately, the paper will provide insight for the broader community on how to use ML driven methods to develop battle analysis insights for the warfighter.
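The Random Forest feature-importance approach described here can be sketched in a few lines. This is an illustrative toy, assuming scikit-learn's `RandomForestClassifier`; the metric names and synthetic data are mine, not the paper's actual simulation dataset:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Hypothetical run-level metrics (names are illustrative, not from the
# paper): loss exchange ratio, missile launch range, and a noise column.
n = 500
ler = rng.normal(1.0, 0.5, n)            # loss exchange ratio
launch_range = rng.normal(20.0, 5.0, n)  # irrelevant in this toy
noise = rng.normal(0.0, 1.0, n)          # irrelevant metric

# Synthetic outcome: blue "wins" mostly when the LER is favorable.
win = (ler + 0.1 * rng.normal(size=n) > 1.0).astype(int)

X = np.column_stack([ler, launch_range, noise])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, win)

# Gini importances gauge how much each metric drives the win/loss call;
# here the LER column should dominate by construction.
for name, imp in zip(["LER", "launch_range", "noise"], clf.feature_importances_):
    print(f"{name}: {imp:.3f}")
```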
Other authors
-
AlphaMosaic: An Artificially Intelligent Battle Management Architecture
Journal of Aerospace Computing, Information and Communication
Warfare is increasing in complexity, speed, and scale—not only due to enhanced technological capabilities but also from the employment methodologies associated with them. Incorporating artificial intelligence (AI) technology into this realm is a cogent solution to help address these complications because of the reduced cost, reduced risk to human life, and increased capability to rapidly adapt to changing environments. However, the introduction of AI comes with a host of new considerations. If AI is to be successfully integrated into air combat, humans must be included in the AI processing loop, and human interaction with AI decision loops must be frictionless. Additionally, AI-supported battle management systems must be designed for high and increasing human trust across dynamically changing scenarios. This paper presents AlphaMosaic, an AI battle manager developed as part of the Defense Advanced Research Projects Agency Air Combat Evolution program that is designed to incorporate human feedback in a manner conducive to true manned–unmanned aircraft teaming in beyond visual range air-combat scenarios.
Other authors
-
Manned-Unmanned Teaming: Research and Applications
Proceedings of the 2021 HFES 65th International Annual Meeting
This panel will discuss issues related to Manned-Unmanned Teaming (MUMT) technologies. Panelists were selected to represent diverse topics and each will provide a unique perspective on the MUMT challenge space. Joseph Lyons will frame the discussion and introduce the panelists. Each panelist will provide an overview of the MUMT research/applications they are involved in. Chris Miller will discuss an ongoing project looking at MUMT applications broadly across the enterprise that seeks to identify the core systemic tenets of MUMT and metrics to gauge MUMT effectiveness. Jay Shively will discuss MUMT challenges in the context of UAS operations in the national airspace. Nancy Cooke will discuss several MUMT research projects that emphasize teaming and associated research challenges. Col. Dan Javorsek will discuss recent MUMT programs at DARPA as well as where MUMT technologies can support Air Force applications. Phillip Walker will discuss the DARPA OFFSET program and human-swarm interactions, including human factor considerations of large swarm demonstration events.
Other authors
-
Game Balancing using Koopman-based Learning
IEEE American Control Conference (ACC), pp. 710-717
This paper addresses the analysis of how the outcome of a zero-sum two-player game is affected by the value of numerical parameters that are part of the game rules and/or winning criterion. This analysis aims at selecting numerical values for such parameters that lead to games that are “fair” or “balanced” in spite of the fact that the two players may have distinct attributes/capabilities. Motivated by applications of game balancing for the commercial gaming industry, our effort is focused on complex multi-agent games for which low-dimensional models in the form of differential or difference equations are not possible or not available. To overcome this challenge, we use a parameter-dependent Koopman operator to model the game evolution, which we train using an ensemble of simulation traces of the actual game. This model is subsequently used to determine values for the game parameters that optimize the appropriate game balancing criterion. The approach proposed here is illustrated and validated on a minigame derived from the StarCraft II real-time strategy game from Blizzard Entertainment.
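The parameter-dependent Koopman training in the paper is far richer than anything that fits here, but the core idea of fitting a linear operator on lifted states from simulation traces can be sketched with plain EDMD on a scalar toy system. The dictionary and dynamics below are my own illustrative choices:

```python
import numpy as np

def edmd(states, next_states, lift):
    """One-step EDMD: least-squares Koopman matrix K with Psi(x') ≈ Psi(x) K."""
    PX = np.array([lift(x) for x in states])        # lifted snapshots
    PY = np.array([lift(y) for y in next_states])   # lifted successors
    K, *_ = np.linalg.lstsq(PX, PY, rcond=None)
    return K

# Toy "game state" dynamics: x' = 0.9x + 0.1x^2, lifted with monomials
# up to degree 2 so the state update is exactly linear in the lift.
lift = lambda x: np.array([1.0, x, x * x])
xs = np.linspace(-1, 1, 200)          # ensemble of simulation traces
ys = 0.9 * xs + 0.1 * xs ** 2
K = edmd(xs, ys, lift)

# Predict one step for an unseen state via the lifted linear model.
x0 = 0.5
pred = lift(x0) @ K                   # lifted prediction [1, x', x'^2]
print(pred[1], 0.9 * x0 + 0.1 * x0 ** 2)
```

In the paper's setting the lifted model would additionally depend on the tunable game parameters, which is what makes it usable for balancing.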
Other authors
-
Hierarchical Reinforcement Learning for Air-to-Air Combat
International Conference on Unmanned Aircraft Systems (ICUAS), pp. 275-284
Artificial Intelligence (AI) is becoming a critical component in the defense industry, as recently demonstrated by DARPA's AlphaDogfight Trials (ADT). ADT sought to vet the feasibility of AI algorithms capable of piloting an F-16 in simulated air-to-air combat. As a participant in ADT, Lockheed Martin's (LM) approach combines a hierarchical architecture with maximum-entropy reinforcement learning (RL), integrates expert knowledge through reward shaping, and supports modularity of policies. This approach achieved a 2nd-place finish in the final ADT event (among eight total competitors) and defeated a graduate of the US Air Force's (USAF) F-16 Weapons Instructor Course in match play.
Other authors
-
Artificial Intelligence Basics for the Flight Test Community
Cockpit
Artificial intelligence, powered by machine learning, already understands your voice, picks your Netflix programs, harvests your personal data, may even drive your car – and it is coming soon to a cockpit near you. Whether replacing traditional fly-by-wire control systems, controlling loyal wingmen, or providing an advanced tactical autopilot to free aircrew to accomplish higher mission tasks, ML-powered systems will require a motivated and informed test community to realize the potential benefits. We present an overview of AI techniques and ML systems, their strengths, weaknesses, and potential test challenges as they apply to air combat. Parallels and differences between traditional flight testing and automated software test processes are also introduced. Finally, a new test paradigm is outlined, expanding “predict, test, validate” to gain insight on and add assurance to the perceptions, decisions, and actions of autonomous systems.
Other authors
-
Search for Small Temporal Modulations of Half-Lives of Radionuclides in the IMS Quality Control Data
CTBT: Science and Technology Conference (SnT 2021), Vienna, Austria, Contribution ID 169
Half-lives of radioisotopes are thought of as absolute constants of Nature. However, since the 1980s several experiments have indicated that small percent- or sub-percent-level temporal modulations may exist, potentially correlated with variations of the solar neutrino flux. The issue has been debated by the nuclear theory community, since it would imply some new mechanism influencing weak decays, and would be of fundamental importance for nuclear physics. One problem is that high quality data collected over an extensive period of time are scarce. As a regular part of their operation, the IMS monitoring stations take so-called quality control data daily, measuring a source of known isotopes for 30 minutes. The stations are at diverse geographic locations and use standardized equipment and sources. Such data are ideal for investigating long-term, small modulations of the half-lives due to an external influence, like solar neutrinos. We obtained and analyzed 15 years’ worth of quality control data from 11 IMS stations for annual and higher frequency modulations. We will present the results of this analysis, including an upper limit on the amplitude of the modulations and suggestions for the design of a future high sensitivity experiment dedicated to settling the issue of temporal modulations of half-lives due to solar influence.
Other authors
-
Context-sensitive, Distributed, Multi-Domain Adaptive Option Generation
Proceedings of SPIE Artificial Intelligence and Machine Learning for Multi-Domain Operations Applications III, Vol. 11746
This paper presents how the combination of our Distributed InteRactivE C2 Tool (DIRECT) and Multi-Domain Adaptive Request Service (MARS) exploits underutilized resources through distributed adaptation of plans across domains. Deliberate planning processes, especially in the military, tend to be slow and unresponsive. Moreover, the introduction of more flexible assets such as multi-role aircraft introduces latent capacity that is often not exploited due to lack of flexible planning processes, thereby representing a significant opportunity to revolutionize the current system. We seek to overcome these challenges by enabling planners to respond to new requests during execution, through a semi-automated, distributed process that quickly generates options for adapting plans while meeting existing commitments, and presents them for human review. To accomplish this, we infer task state from reported mission states to simplify the manual process of tracking tasks and ensure that the adapted plan incorporates incomplete tasks but does not replan completed tasks. Our dynamic replanner generates options quickly, e.g., 316 seconds to adapt a plan with 345 missions to incorporate 1000 new tasks. This significantly increases utilization of resources, with 60%-70% of imagery requests for battle damage assessment being satisfied by multi-role fighters already flying. Finally, we provide options in context of the existing plan through adaptive option ranking that promotes options that meet operator preferences as judged from abstract evaluation factors designed to apply across different domains. The ranking achieves 80% accuracy for predicting the top option, presenting the preferred option to the operator the vast majority of the time.
Other authors
-
Introducing SMRTT: A Structural Equation Model of Multimodal Real-Time Trust
HRI '21 Companion: Companion of the 2021 ACM/IEEE International Conference on Human-Robot Interaction, Pages 126-130
Advances in autonomous technology have led to an increased interest in human-autonomy interactions. Generally, the success of these interactions is measured by the joint performance of the AI and the human operator. This performance depends, in part, on the operator having appropriate, or calibrated, trust of the autonomy. Optimizing the performance of human-autonomy teams therefore partly relies on the modeling and measuring of human trust. Theories and models have been developed on the factors influencing human trust in order to properly measure it. However, these models often rely on self-report rather than more objective, real-time behavioral and physiological data. This paper seeks to build on theoretical frameworks of trust by adding objective data to create a model capable of finer-grained temporal measures of trust. Presented herein is SMRTT: SEM of Multimodal Real Time Trust. SMRTT leverages Structural Equation Modeling (SEM) techniques to arrive at a real time model of trust. Variables and factors from previous studies and existing theories are used to create components of SMRTT. The value of adding physiological data to the models to create real-time monitoring is discussed along with future plans to validate this model.
Other authors
-
Adaptive Policy Tree Algorithm to Approach Collision-Free Transmissions in Slotted ALOHA
IEEE 17th International Conference on Mobile Ad Hoc and Sensor Systems (MASS), pp. 138-146
A new adaptive transmission protocol is introduced to improve the performance of slotted ALOHA. Nodes use known periodic schedules as base policies with which they collaboratively learn how to transmit periodically in different time slots so that packet collisions are minimized. The Adaptive Policy Tree (APT) algorithm is introduced for this purpose, which results in APT-ALOHA. APT-ALOHA does not require the presence of a central repeater and uses explicit acknowledgements to confirm the reception of packets. It is shown that nodes using APT-ALOHA quickly converge to transmission schedules that are virtually collision-free, and that the throughput of APT-ALOHA resembles that of TDMA, where slots are pre-allocated to nodes. In particular, APT-ALOHA attains successful utilization of over 70% of time slots in saturation mode.
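For context on the baseline being improved, plain slotted ALOHA with random transmissions tops out near 1/e ≈ 37% slot utilization, which is what makes the converged, TDMA-like schedules above (over 70%) notable. A minimal simulation of that baseline, with parameters chosen by me for illustration:

```python
import random

def slotted_aloha_throughput(n_nodes, p, n_slots, seed=0):
    """Fraction of slots carrying exactly one transmission (a success)
    when every node transmits independently with probability p per slot."""
    rng = random.Random(seed)
    successes = 0
    for _ in range(n_slots):
        transmitters = sum(rng.random() < p for _ in range(n_nodes))
        successes += transmitters == 1   # two or more transmitters collide
    return successes / n_slots

# Classic result: with p = 1/N the success rate approaches 1/e ≈ 0.37,
# the ceiling that APT-ALOHA's learned collision-free schedules beat.
print(slotted_aloha_throughput(n_nodes=20, p=1 / 20, n_slots=100_000))
```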
Other authors
-
Anomalies in Radioactive Decay Rates: A Bibliography of Measurements and Theory
arXiv
Knowledge of the decay rates (or half-lives) of radioisotopes is critical in many fields, including medicine, archeology, and nuclear physics, to name just a few. Central to the many uses of radioisotopes is the belief that decay rates are fundamental constants of nature, just as the masses of the radioisotopes themselves are. Recently, the belief that decay rates are fundamental constants has been called into question following the observation of various reported anomalies in decay rates, such as apparent periodic variations. The purpose of this bibliography is to collect in one place the relevant literature on both sides of this issue in the expectation that doing so will deepen our understanding of the available data.
Other authors
-
The Y's of Aircrew Health Effects: Electromagnetic Radiation, Offspring Gender Asymmetries, and Cancer
Society of Experimental Test Pilots (SETP) Great Lakes Regional Symposium
The concept that aircrew father more daughters than sons is a persistent rumor within aviation circles. This study investigated the relationship between aircrew offspring gender asymmetries and cancer incidence rates to highlight aviation-unique environmental influences (non-ionizing electromagnetic radiation, cosmic ray exposure, G-forces, etc.). Although studies of this nature have been performed in the past, they were limited to small sample sizes, giving rise to poor statistics. In what follows I present the results of a much larger dataset that included 11,178 births and permitted a breakdown into aircraft category and individual aircraft mission design series. Offspring gender asymmetries serve as a reasonable cancer risk indicator, and aircrew cancer incidence rates are 13.9 times greater than the U.S. population for all aircrew surveyed. Following a discussion of the role non-ionizing radiation plays in tautomeric DNA mutations, I conclude with a self-identification, high-risk checklist to aid aircrew in health screening decisions.
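The statistical core of a sex-ratio study like this is a simple test of a binomial proportion against the population baseline. A sketch using a normal approximation; the daughter count below is invented for illustration and is not the paper's result (only the 11,178-birth total comes from the abstract):

```python
import math

def sex_ratio_z(daughters, births, p_daughter=0.488):
    """Normal-approximation z-score for an excess of daughters over the
    roughly 48.8% baseline probability of a female birth."""
    expected = births * p_daughter
    sd = math.sqrt(births * p_daughter * (1 - p_daughter))
    return (daughters - expected) / sd

# Hypothetical count: 5,800 daughters out of 11,178 births.
z = sex_ratio_z(5800, 11178)
print(round(z, 1))  # a z-score this large would indicate a real asymmetry
```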
-
Engineering Judgment Tradecraft: Strategies for Transcending the Limits of Uncertainty from Human Agency
Society of Experimental Test Pilots (SETP) East Coast Regional Symposium
Engineering judgment is at the heart of the most pivotal flight test and aerospace design, programmatic, and execution decisions. However, unlike much of the engineering design process based on quantitative data and empirical modeling, engineering judgment is fundamentally a mental process which often seems to be more art than science. This is because understanding the mental process of engineering judgment is hindered by the lack of conscious awareness of how our minds work. Many functions associated with perception, memory, and information processing are conducted prior to and independently of any conscious direction, giving rise to troubling cognitive biases that negatively impact judgment and decision making. Fortunately, these weaknesses and biases inherent in human thinking processes can be alleviated by conscious application of tools and techniques that should be in every flight test engineer’s toolkit. In what follows I propose the Fragility-Effectiveness Assessment to systematically improve your application of the engineering judgment tradecraft by helping to externalize and decompose tough decisions.
-
Test Planning and Risk Mitigation Strategies for Complex Aircraft Systems
Cockpit
Modern aircraft are increasingly complex and the relationships between integrated subsystems are not always well understood by designers or testers. As a result of this complexity, the ability to predict outcomes during test is often limited, with the causes of failures not easily understood beforehand. Additionally, it is frequently impossible to design flight test programs that test all possible system modes, environments, and use cases, making it extremely challenging to verify full system performance for end users. We propose specific risk management and test strategies for flight testing complex systems based on case studies that include F-16 Active Electronically Scanned Array (AESA) radar integration testing and the F-35 arresting hook test safety review board assessment. These case studies illustrate various types of complexity and show how the proposed strategies can mitigate technical and safety risk for complex systems.
-
Spectral Content in 3H and 14C Decays: A Review of Five Experiments
Quarterly Physics Review, Vol. 3, pp. 1-8
We conduct a generalized spectral analysis of previously published data to re-examine new reports of annual and monthly periodicities in the decay of 3H and 14C. We find no common spectral content in two pairs of simultaneously measured 3H and 14C samples, suggesting fluctuations over the nearly nine year experiment are systematic effects rather than evidence of solar influence on decay rates. Direct comparisons to three other 3H experiments with anomalous results also suggest the presence of systematic effects rather than the appearance of new physics.
-
Flutter Testing on a Budget: Analysis and Flight Test Techniques in a Resource Constrained Environment
Society of Experimental Test Pilots (SETP) Great Lakes Regional Symposium
Flight test professionals not associated with major acquisitions programs are often forced to make difficult tradeoffs when presented with limited budgets and rapid development timelines. Following a brief description of the generic challenges associated with structural testing in these resource-constrained programs we summarize some specific techniques test teams may utilize to improve safety. We introduce frequency analysis techniques that exploit comparative power statistics especially useful for vehicles without a flutter excitation system. We also introduce a method of monitoring for flutter using the telltale statistical signature that manifests in residuals just prior to an impending critical transition. In particular, these analysis methods--when properly integrated into the flight test program using existing flight test techniques--can help reduce uncertainties, assist in risk management, and improve overall test efficiency.
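The critical-slowing-down signature described above can be illustrated with a short sketch (the synthetic data and parameter choices here are mine, not from the paper): as effective damping drops toward a critical transition such as flutter onset, the lag-1 autocorrelation of model residuals drifts toward 1, giving a model-free early warning.

```python
import numpy as np

def lag1_autocorrelation(residuals, window):
    """Rolling lag-1 autocorrelation of residuals. Critical-slowing-down
    theory predicts this statistic rises toward 1.0 as a system
    approaches a critical transition (e.g. a flutter boundary)."""
    ac = np.full(len(residuals), np.nan)
    for i in range(window, len(residuals) + 1):
        w = residuals[i - window:i]
        w = w - w.mean()
        denom = np.sum(w * w)
        if denom > 0:
            ac[i - 1] = np.sum(w[:-1] * w[1:]) / denom
    return ac

# Synthetic demo: AR(1) noise whose correlation coefficient ramps up,
# mimicking reduced damping as the critical point is approached.
rng = np.random.default_rng(0)
n = 2000
phi = np.linspace(0.1, 0.95, n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi[t] * x[t - 1] + rng.normal()
ac = lag1_autocorrelation(x, window=200)
print(f"early AC1 ~ {ac[300]:.2f}, late AC1 ~ {ac[-1]:.2f}")
```

The same rolling statistic can be computed on variance instead of autocorrelation; both are expected to rise before the transition.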
-
The Mishap that Never Happened: Addressing the Missing Data Problem in Flight Test
Society of Experimental Test Pilots (SETP) Flight Test Safety Conference
The flight test community struggles with a missing data problem that fundamentally opposes safety. This arises from our inability to realize and quantify the probabilities associated with alternate futures for catastrophes with low event rates. As a result, new and existing mishap prevention measures suffer from two crippling challenges: resistance to initial funding and difficulty proving effectiveness once implemented. In an extension of earlier work applying Complexity Theory to flight test, we show that the combined air vehicle, test team, and acquisition community constitute a complex adaptive system. In predicting the behavior of such systems, we influence rather than control outcomes and can improve safety by cultivating assertive skeptics willing to challenge the status quo in the flight test community. In this paper we provide a “behind the glossy brochure” look at the challenges and the assertive skeptics who made a recently celebrated mishap prevention program a reality.
-
Enhancing Flight Test Safety with Real-time Early Warning Techniques
Cockpit, Jan-Jun
Contemporary approaches to aerospace vehicle system monitoring rely heavily on thresholds that represent a compromise between providing warning early enough to avert a mishap while simultaneously minimizing false alarms. While this reliance on thresholds has been in place for decades and has permeated both cockpits and control rooms, we often find it insufficient when retrospectively analyzing data from an accident. To modernize and enhance flight test safety, we introduce new methods of monitoring for anomalous patterns of interaction rather than for threshold exceedances. For systems with a well characterized baseline we show how the Inductive Monitoring System (IMS), utilized by NASA in the aftermath of the Columbia accident, might be implemented in real-time to provide earlier warning than currently employed techniques. For systems without such a baseline, we introduce new developments in statistical methods relating to critical slowing down, first applied in medicine and physics, which show promise for adaptation to flight test. Finally, the familiar resource-constrained environment leads to a reliance on increased instrumentation that is challenging the limits of the current “one-sensor, one-indicator” threshold paradigm. Existing methods thus fail to accurately reflect the true complexity of a vehicle rich with interdependent interacting systems. We highlight these concepts in a brief summary of the 2001 Air Transat Flight 236 deadstick landing in the Azores. We then suggest control room and cockpit modifications to better display the information gleaned using these novel monitoring methods.
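The IMS idea can be sketched roughly as follows (a simplified stand-in, not NASA's implementation): learn per-parameter min/max envelopes from clusters of nominal telemetry, then score incoming sensor vectors by their distance outside the nearest envelope, so that patterns of interaction never seen in nominal data raise an alarm even when no individual threshold is exceeded.

```python
import numpy as np

def build_envelopes(nominal, n_clusters=8, seed=0):
    """Group nominal sensor vectors and keep per-cluster min/max
    envelopes -- a simplified stand-in for the knowledge base the
    Inductive Monitoring System learns from nominal data."""
    rng = np.random.default_rng(seed)
    centers = nominal[rng.choice(len(nominal), n_clusters, replace=False)]
    # one assignment pass to the random centers, then an envelope per cluster
    d = np.linalg.norm(nominal[:, None, :] - centers[None], axis=2)
    labels = d.argmin(axis=1)
    return [(nominal[labels == k].min(axis=0),
             nominal[labels == k].max(axis=0))
            for k in range(n_clusters) if np.any(labels == k)]

def anomaly_score(x, envelopes):
    """Distance from x to the nearest envelope; 0 means the vector
    lies inside some region of previously seen nominal behavior."""
    best = np.inf
    for lo, hi in envelopes:
        # per-parameter distance outside the [lo, hi] box
        gap = np.maximum(lo - x, 0) + np.maximum(x - hi, 0)
        best = min(best, np.linalg.norm(gap))
    return best

rng = np.random.default_rng(1)
nominal = rng.normal(size=(500, 4))          # nominal training telemetry
envelopes = build_envelopes(nominal)
print(anomaly_score(nominal[0], envelopes))  # a training vector scores 0.0
print(anomaly_score(np.full(4, 10.0), envelopes))  # off-nominal scores high
```

A rising anomaly score trended in real time would then be the early-warning display quantity rather than any single sensor value.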
-
Cultivating the Assertive Skeptic: A Proposal to Modernize Flight Test Safety to Address Human Agency
Cockpit Magazine, Jan-Jun, pp. 75-95
Like all fields of research, flight test continues to evolve as new information and techniques are introduced. Many times this change happens gradually, leading to subtle overall improvements such as those associated with data collection and analysis. However, with respect to safety, the changes are often characterized by discrete responses to specific catastrophic events. After a brief historical summary of worldwide and USAF fatal aircraft accidents, we demonstrate that in the modern era, human agency has been a dominant causal factor. This has led to an effective plateau in the mishap rate that seems unlikely to change unless we shift to better prepare for inevitable but unpredictable events. Fortunately, we are able to borrow lessons from the nascent field of Complexity Theory which specifically focuses on adaptive systems such as those which include human agency. In the end, forcing ourselves to acknowledge uncertainty will help us design more robust aircraft systems and better posture us to respond to unanticipated, high impact outcomes. By investigating the psychology associated with cognitive biases and placing an emphasis on decision making and judgment, we may better shape our perspective. Shifting to a more probabilistic mindset results in flight test professionals skeptical of systems optimized by a resource-constrained acquisition environment and will help us design robustness to prepare for human agency. Finally, continuing this educational outreach and line of research on the psychology and uncertainty associated with human agency will help shift the flight test community to include more assertive skeptics. Embracing this fundamental change to the flight test culture will modify the way we approach risk assessment/management and is one promising method of modernizing the flight test safety process.
-
Modernizing Flight Test Safety to Address Human Agency
The ITEA Journal of Test and Evaluation, Vol. 37, pp. 325-332
After a brief historical summary of worldwide and USAF fatal aircraft accidents, it has been demonstrated that in the modern era, human agency has been a dominant causal factor. This has led to an effective plateau in the mishap rate that seems unlikely to change unless we shift to better prepare for unpredictable events. The promising nascent field of Complexity Theory, which specifically focuses on adaptive systems such as those that include human agency, offers a fresh perspective on the inclusion of uncertainty. I suggest that this more probabilistic representation will help cultivate assertive skeptics within the flight test community, leading to the design of safer aerospace systems more robust to execution errors arising from human agency.
-
A Formal Risk-Effectiveness Analysis Proposal for the Compartmentalized Intelligence Security Structure
International Journal of Intelligence and Counterintelligence, Vol. 28, pp. 734-761
Knowledge can be a dominant source of power, and action is required to determine if the current information security architecture is jeopardizing U.S. strategic advantage. Intelligence is the dial tone of national security which, without an appropriate risk-effectiveness assessment, may be in jeopardy of being relegated to a simple data collection service. The nature of compartmented programs is such that individuals must demonstrate a ‘‘need to know’’ which inherently relies on an outside observer. As the number of such programs rises, the burden of identifying common elements of tangential programs can be overwhelming. While a master strategy that identifies the trade-offs of compartmentalization is undoubtedly performed upon program inception, the need for a reassessment can often become out-prioritized by seemingly more pressing issues. The existing environment of fiscal austerity may prove an ideal time for such an assessment since, according to the DNI, only God enjoys the vantage point to see all programs.
-
Comparative Study of Beta-Decay Data for Eight Nuclides Measured at the Physikalisch-Technische Bundesanstalt
Astroparticle Physics, Vol. 59, pp. 47-58
We present the results of time-series analyses of data, kindly provided by the Physikalisch-Technische Bundesanstalt, concerning the beta-decays of Ag108, Ba133, Cs137, Eu152, Eu154, Kr85, Ra226, and Sr90. From measurements of the detector currents, we find evidence of annual oscillations (especially for Ra226), and for several solar r-mode oscillations. It is notable that the frequencies of these r-mode oscillations correspond to exactly the same sidereal rotation rate (12.08 year−1) that we have previously identified in r-mode oscillations detected in both Mt Wilson solar diameter data and Lomonosov Moscow State University Sr90 beta-decay data. Ba133 is found to be anomalous in that current measurements for this nuclide have a much larger variation (by 4 σ) than those of the other nuclides. It is interesting that analysis of variability measurements in the PTB files yields strong evidence for an oscillation for Ba133 but only weak evidence for Ra226.
-
Probing Uncertainty, Complexity, and Human Agency in Intelligence
Intelligence and National Security, Vol. 29, pp. 639-653
Geopolitical dynamics associated with nuclear proliferation, the Arab Spring, the rapid rise of Chinese power, an oil-fueled Russian resurgence, and the post-Afghan and Iraq eras will demand significant changes in intelligence focus, processes, and resources. Nearly a decade after intelligence failures required a restructuring of the Intelligence Community with mandates for a scientific approach to intelligence analysis, current efforts continue to focus on overly deterministic individual analyst methods. We argue for a process-oriented approach to analysis resembling the collaborative scientific process successful in other professions that is built on shared theory and models. After demonstrating that events in the real world are path dependent and contingent on deterministic and random elements, we highlight the role of uncertainty in intelligence analysis with specific emphasis on intelligence failures. We then describe how human agency in an interconnected and interdependent system leads to a landscape of dancing strategies as agents dynamically modify their responses to events. Unfortunately, the consequences of the present deterministic intelligence mindset are significant time delays in adjusting to emerging adversaries leading to an increased susceptibility to intelligence failures. In contrast with the existing analyst-centric methods, we propose a risk management approach enhanced by outside collaboration on theory and models that embrace lessons from the twentieth-century science of uncertainty, human agency, and complexity.
-
Time-Varying Nuclear Decay Parameters and Dark Matter
arXiv
Recently published data suggest a possible solar influence on some nuclear decay rates, including evidence for an annual variation attributed to the varying Earth-Sun distance. Here, we consider the possibility that the annual signal seen by the DAMA collaboration, and interpreted by them as evidence for dark matter, may in fact be due to the radioactive contaminant K-40, which is known to be present in their detector. We also consider the possibility that part of the DAMA signal may arise from relic big-bang neutrinos.
-
Stability of the IMS Radionuclide Detector Network and Lessons Learned for Exotic Physics Searches
CTBT: Science and Technology Conference (SnT 2013), Vienna, Austria
We report the results from spectral analysis of the International Monitoring System (IMS) Radionuclide Aerosol Sampler/Analyzer (RASA) system. We demonstrate unusual oscillations in the check source photopeaks that mimic recent reports of nuclear decay anomalies. Since many dark matter and decay anomaly reports rely on the stability of a single detector, RASA diagnostics and multiple geographic locations offer a unique capability to characterize these oscillations. Our analysis supports a conventional explanation for the observed oscillations, with implications for anomaly searches.
-
Power-Spectrum Analysis of Reconstructed DAMA Data
arXiv
Claims by the DAMA (DArk MAtter) collaboration to have detected an annually varying signal consistent with models of dark matter appear to be at variance with results from other dark-matter searches. To further understand the DAMA results, we have carried out an independent analysis of DAMA data reconstructed from published figures. In addition to reexamining the Lomb-Scargle and chi-square analyses previously carried out by the DAMA collaboration, we carry out two new likelihood analyses and a new chi-square analysis, focusing attention on the treatment of experimental errors and binning. We confirm the existence of an annual oscillation, with a maximum in early June, but at a lower significance level than previously reported.
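The Lomb-Scargle periodogram used in such analyses handles unevenly sampled data directly; a minimal sketch on synthetic data (the reconstructed DAMA points themselves are not reproduced here, so the amplitudes and noise level below are illustrative only):

```python
import numpy as np
from scipy.signal import lombscargle

# Synthetic stand-in for unevenly sampled residual-rate data:
# an annual modulation peaking near early June, plus noise.
rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0, 7 * 365.25, 400))         # days, uneven spacing
y = 0.02 * np.cos(2 * np.pi * (t - 152) / 365.25)    # day 152 ~ June 1
y += rng.normal(scale=0.01, size=t.size)

# Scan periods from ~4 months to 2 years (0.5-3 cycles/year).
freqs = np.linspace(0.5, 3.0, 2000) / 365.25         # cycles/day
pgram = lombscargle(t, y - y.mean(), 2 * np.pi * freqs)  # wants angular freq
best_period = 1.0 / freqs[pgram.argmax()]
print(f"dominant period ~ {best_period:.0f} days")
```

Note that `scipy.signal.lombscargle` takes angular frequencies, hence the factor of 2π; the phase of the recovered oscillation (here, the maximum near early June) is what the analyses above compare against the dark-matter expectation.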
-
Spectral Content of 22Na/44Ti Decay Data: Implications for a Solar Influence
Astrophysics and Space Science, Vol. 344, pp. 297-303
We report a reanalysis of data on the measured decay rate ratio 22Na/44Ti which were originally published by Norman et al., and interpreted as supporting the conventional hypothesis that nuclear decay rates are constant and not affected by outside influences. We find upon a more detailed analysis of both the amplitude and the phase of the Norman data that they actually favor the presence of an annual variation in 22Na/44Ti, albeit weakly. Moreover, this conclusion holds for a broad range of parameters describing the amplitude and phase of an annual sinusoidal variation in these data. The results from this and related analyses underscore the growing importance of phase considerations in understanding the possible influence of the Sun on nuclear decays. Our conclusions with respect to the phase of the Norman data are consistent with independent analyses of solar neutrino data obtained at Super-Kamiokande-I and the Sudbury Neutrino Observatory (SNO).
-
The Case for a Solar Influence on Certain Nuclear Decay Rates
arXiv
Power-spectrum analyses of the decay rates of certain nuclides reveal (at very high confidence levels) an annual oscillation and periodicities that may be attributed to solar rotation and to solar r-mode oscillations. A comparison of spectrograms (time-frequency displays) formed from decay data and from solar neutrino data reveals a common periodicity with frequency 12.5 year-1, which is indicative of the solar radiative zone. We propose that the neutrino flux is modulated by the solar magnetic field (via Resonant Spin Flavor Precession) in that region, and we estimate the force and the torque that could be exerted on a nuclide by the solar neutrino flux.
-
An Analysis of Apparent R-mode Oscillations in Solar Activity, the Solar Diameter, the Solar Neutrino Flux, and Nuclear Decay Rates, with Implications Concerning the Sun’s Internal Structure and Rotation, and Neutrino Processes
Astroparticle Physics, Vol. 42, pp. 62-69
This article presents a comparative analysis of solar activity data, Mt Wilson diameter data, Super-Kamiokande solar neutrino data, and nuclear decay data acquired at the Lomonosov Moscow State University (LMSU). We propose that salient periodicities in all of these datasets may be attributed to r-mode oscillations. Periodicities in the solar activity data and in Super-Kamiokande solar neutrino data may be attributed to r-mode oscillations in the known tachocline, with normalized radius in the range 0.66–0.74, where the sidereal rotation rate is in the range 13.7–14.6 year−1. We propose that periodicities in the Mt Wilson and LMSU data may be attributed to similar r-mode oscillations where the sidereal rotation rate is approximately 12.0 year−1, which we attribute to a hypothetical “inner” tachocline separating a slowly rotating core from the radiative zone. We also discuss the possible role of the Resonant Spin Flavor Precession (RSFP) process, which leads to estimates of the neutrino magnetic moment and of the magnetic field strength in or near the solar core.
-
Concerning the Time Dependence of the Decay Rate of 137Cs
Applied Radiation and Isotopes, Vol. 74, pp. 50-55
The decay rates of eight nuclides (85Kr, 90Sr, 108Ag, 133Ba, 137Cs, 152Eu, 154Eu, and 226Ra) were monitored by the standards group at the Physikalisch-Technische Bundesanstalt (PTB), Braunschweig, Germany, over the time frame June 1999 to November 2008. We find that the PTB measurements of the decay rate of 137Cs show no evidence of an annual oscillation, in agreement with the recent report by Bellotti et al. However, power spectrum analysis of PTB measurements of a 133Ba standard, measured in the same detector system, does show such evidence. This result is consistent with our finding that different nuclides have different sensitivities to whatever external influences are responsible for the observed periodic variations.
-
Additional Experimental Evidence for a Solar Influence on Nuclear Decay Rates
Astroparticle Physics, Vol. 37, pp. 81-88
Additional experimental evidence is presented in support of the recent hypothesis that a possible solar influence could explain fluctuations observed in the measured decay rates of some isotopes. These data were obtained during routine weekly calibrations of an instrument used for radiological safety at The Ohio State University Research Reactor using 36Cl. The detector system used was based on a Geiger–Müller gas detector, which is a robust detector system with very low susceptibility to environmental changes. A clear annual variation is evident in the data, with a maximum relative count rate observed in January/February, and a minimum relative count rate observed in July/August, for seven successive years from July 2005 to June 2011. This annual variation is not likely to have arisen from changes in the detector surroundings, as we show here.
-
Analysis of Gamma Radiation from a Radon Source: Indications of a Solar Influence
Astroparticle Physics, Vol. 36, pp. 18-25
This article presents an analysis of about 29,000 measurements of gamma radiation associated with the decay of radon in a sealed container at the Geological Survey of Israel (GSI) Laboratory in Jerusalem between 28 January 2007 and 10 May 2010. These measurements exhibit strong variations in time of year and time of day, which may be due in part to environmental influences. However, time-series analysis reveals a number of periodicities, including two at approximately 11.2 year−1 and 12.5 year−1. We have previously found these oscillations in nuclear-decay data acquired at the Brookhaven National Laboratory and at the Physikalisch-Technische Bundesanstalt, and we have suggested that these oscillations are attributable to some form of solar radiation that has its origin in the deep solar interior. A curious property of the GSI data is that the annual oscillation is much stronger in daytime data than in nighttime data, but the opposite is true for all other oscillations. This may be a systematic effect but, if it is not, this property should help narrow the theoretical options for the mechanism responsible for decay-rate variability.
-
Study of Nuclear Decays During a Solar Eclipse: Thule Greenland 2008
Astrophysics and Space Science, Vol. 342, pp. 9-13
Recent efforts to determine the cause of anomalous experimental nuclear decay fluctuations suggest a possible solar influence. Here we report on the results from several nuclear decay experiments performed at Thule Air Base in Greenland during the solar eclipse on 1 August 2008. Thule was ideal for this experiment due to its proximity to the magnetic north pole, which amplified changes in the charged particle flux and provided relatively stabilized conditions for nearly all environmental factors. An exhaustive list of relevant factors was monitored during the eclipse to help rule out possible systematic effects in the event of unexpected results. We included measurements of temperature, pressure, and humidity as well as power supply outputs, neutron count rates, and the Earth’s local electric and magnetic fields. Nuclear decay measurements of 14C, 90Sr, 99Tc, 210Bi, 234Pa, and 241Am were made using Geiger-Müller (GM) ionization chambers. Although our data exhibit no evidence for a statistically significant change in the decay rate of any nuclide measured during the 1 August 2008 solar eclipse, small anomalies remain to be understood.
F-22 Raptor GBU-39 Separation Test Results
AIAA Guidance, Navigation, and Control Conference, Vol. 9, pp. 7835-7846
The Guided Bomb Unit (GBU)-39 Small Diameter Bomb (SDB) enhanced F-22A air-to-ground combat capability by significantly increasing the number of targets that can be engaged while simultaneously permitting greater standoff engagement range. This four-year, $25 million modernization developmental test effort incorporated several new capabilities and included store certification via loads, separations, and guided testing. In addition to the first-ever supersonic SDB delivery, notable achievements included releases from up to 50,000 feet and Mach 1.72 against targets over 70 nautical miles downrange. Since the SDBs are carried internally, the dynamics associated with supersonic releases are complicated by shock transient effects that ultimately impact safe separation and subsequent guidance. The separations phase of this certification is summarized with a focus on bomb rack ejector settings and roll-rate instability issues.
Concerning the Phases of the Annual Variations of Nuclear Decay Rates
The Astrophysical Journal, Vol. 737, pp. 65-69
Recent analyses of data sets acquired at the Brookhaven National Laboratory and at the Physikalisch-Technische Bundesanstalt both show evidence of pronounced annual variations, suggestive of a solar influence. However, the phases of decay-rate maxima do not correspond precisely to the phase of minimum Sun-Earth distance, as might then be expected. We here examine the hypothesis that decay rates are influenced by an unknown solar radiation, but that the intensity of the radiation is influenced not only by the variation in Sun-Earth distance, but also by a possible north-south asymmetry in the solar emission mechanism. We find that this can lead to phases of decay-rate maxima in the range 0-0.183 or 0.683-1 (September 6 to March 8) but that, according to this hypothesis, phases in the range of 0.183-0.683 (March 8 to September 6) are "forbidden." We find that phases of the three data sets analyzed here fall in the allowed range.
F-22 All Weather Fighter: Recent ECS Testing Results
Society of Flight Test Engineers International Symposium Proceedings
Since meeting initial operating capability (IOC) on 15 December 2005, the F-22 Raptor has been operated in a wide range of locations with environments as diverse as the tropics and the frozen tundra. Each of the locations reflected environmental conditions significantly different from the Mojave Desert where the aircraft was flight tested. Budget constraints early in the program prohibited a complete characterization of the environmental control system (ECS) with an instrumented aircraft. This decision significantly impacted flight test and resulted in reactive testing, subsequent to IOC, to troubleshoot system behavior and test potential solutions. Over the last five years, the warfighter has encountered lingering ECS problems including elevated failure rates due to condensation and numerous issues with the liquid cooling subsystem. Lessons learned from developmental test are summarized with a focus on concepts of interest to the flight test community including: efforts in reducing condensation, cockpit conditioning system response, quantifying the effects of contaminants, and redundant leak detection in liquid cooling subsystems.
Power Spectrum Analysis of Physikalisch-Technische Bundesanstalt Decay-Rate Data: Evidence for Solar Rotational Modulation
Solar Physics, Vol. 267, pp. 251-265
Evidence for an anomalous annual periodicity in certain nuclear-decay data has led to speculation on a possible solar influence on nuclear processes. We have recently analyzed data concerning the decay rates of 36Cl and 32Si, acquired at the Brookhaven National Laboratory (BNL), to search for evidence that might be indicative of a process involving solar rotation. Smoothing of the power spectrum by weighted-running-mean analysis leads to a significant peak at frequency 11.18 year−1, which is lower than the equatorial synodic rotation rates of the convection and radiative zones. This article concerns measurements of the decay rates of 226Ra acquired at the Physikalisch-Technische Bundesanstalt (PTB) in Germany. We find that a similar (but not identical) analysis yields a significant peak in the PTB dataset at frequency 11.21 year−1, and a peak in the BNL dataset at 11.25 year−1. The change in the BNL result is not significant, since the uncertainties in the BNL and PTB analyses are estimated to be 0.13 year−1 and 0.07 year−1, respectively. Combining the two running means by forming the joint power statistic leads to a highly significant peak at frequency 11.23 year−1. We will briefly comment on the possible implications of these results for solar physics and for particle physics.
Power Spectrum Analyses of Nuclear Decay Rates
Astroparticle Physics, Vol. 34, pp. 173-178
We provide the results from a spectral analysis of nuclear decay data displaying annually varying periodic fluctuations. The analyzed data were obtained from three distinct data sets: 32Si and 36Cl decays reported by an experiment performed at the Brookhaven National Laboratory (BNL), 56Mn decay reported by the Children’s Nutrition Research Center (CNRC), but also performed at BNL, and 226Ra decay reported by an experiment performed at the Physikalisch–Technische Bundesanstalt (PTB) in Germany. All three data sets exhibit the same primary frequency mode consisting of an annual period. Additional spectral comparisons of the data to local ambient temperature, atmospheric pressure, relative humidity, Earth–Sun distance, and their reciprocals were performed. No common phases were found between the factors investigated and those exhibited by the nuclear decay data. This suggests that either a combination of factors was responsible, or that, if it was a single factor, its effects on the decay rate experiments are not a direct synchronous modulation. We conclude that the annual periodicity in these data sets is a real effect, but that further study involving additional carefully controlled experiments will be needed to establish its origin.
Power Spectrum Analysis of BNL Decay Rate Data
Astroparticle Physics, Vol. 34, pp. 121-127
Evidence for an anomalous annual periodicity in certain nuclear decay data has led to speculation concerning a possible solar influence on nuclear processes. As a test of this hypothesis, we here search for evidence in decay data that might be indicative of a process involving solar rotation, focusing on data for 32Si and 36Cl decay rates acquired at the Brookhaven National Laboratory. Examination of the power spectrum over a range of frequencies (10–15 year−1) appropriate for solar synodic rotation rates reveals several periodicities, the most prominent being one at 11.18 year−1 with power 20.76. We evaluate the significance of this peak in terms of the false-alarm probability, by means of the shuffle test, and also by means of a new test (the “shake” test) that involves small random time displacements. The last two tests are the more robust, and indicate that the peak at 11.18 year−1 would arise by chance only once out of about 10^7 trials. However, the fact that there are several peaks in the rotational search band suggests that modulation of the count rate involves several low-Q oscillations rather than a single high-Q oscillation, possibly indicative of a partly stochastic process. To pursue this possibility, we investigate the running mean of the power spectrum, and identify a major peak at 11.93 year−1 with peak running-mean power 4.08. Application of the shuffle test indicates that there is less than one chance in 10^11 of finding by chance a value as large as 4.08. Application of the shake test leads to a more restrictive result: there is less than one chance in 10^15 of finding by chance a value as large as 4.08. We find that there is notable agreement in the running-mean power spectra in the rotational search band formed from BNL data and from ACRIM total solar irradiance data. Since rotation rate estimates derived…
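The shuffle test described in the abstract above can be sketched numerically. This is an illustrative reconstruction, not the authors' code: the synthetic series, amplitudes, frequency grid, and shuffle count below are all invented for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a decay-rate series: 5 years of daily samples with a
# weak oscillation at 11.18 year^-1 buried in noise (invented amplitudes).
t = np.arange(0, 5, 1 / 365.25)                    # time in years
data = (1.0 + 5e-4 * np.sin(2 * np.pi * 11.18 * t)
        + 2e-3 * rng.standard_normal(t.size))

def peak_power(y, t, freqs):
    """Maximum normalized periodogram power over a frequency band."""
    y = y - y.mean()
    best = 0.0
    for f in freqs:
        c = np.cos(2 * np.pi * f * t)
        s = np.sin(2 * np.pi * f * t)
        p = (y @ c) ** 2 / (c @ c) + (y @ s) ** 2 / (s @ s)
        best = max(best, p)
    return best / (2 * np.var(y))                  # Lomb-Scargle-style normalization

freqs = np.linspace(10, 15, 100)                   # solar-rotation search band
observed = peak_power(data, t, freqs)

# Shuffle test: permuting the samples destroys any temporal structure while
# keeping the value distribution, so shuffled peak powers estimate the
# false-alarm distribution for the observed peak.
n_shuffles = 100
exceed = sum(peak_power(rng.permutation(data), t, freqs) >= observed
             for _ in range(n_shuffles))
print(f"peak power {observed:.1f}; matched or exceeded in {exceed}/{n_shuffles} shuffles")
```

With these invented numbers the injected oscillation produces a peak far above anything the shuffled series reach, which is the qualitative shape of the false-alarm argument made in the paper.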
Periodicities in Nuclear Decay Data: Systematic Effects or New Physics?
American Institute of Physics Conference Proceedings, Vol. 1265, pp. 144-147
Recent comparisons of independent and unrelated nuclear decay experiments have shown unexplained oscillations that appear to be common in frequency and phase. The most logical explanation for this fluctuation would be some common systematic or environmental factor. In this paper we provide detailed spectral analysis comparisons of several environmental factors with nuclear decay data from an experiment performed at the Brookhaven National Laboratory. We demonstrate that, although none of the environmental factors investigated can be causal, comparisons with ACRIM solar irradiance measurements provide good agreement with the nuclear decay data. This analysis provides the first direct evidence that the cause of the fluctuations has a possible solar origin.
Preliminary Results from Nuclear Decay Experiments Performed During the Solar Eclipse of August 1, 2008
American Institute of Physics Conference Proceedings, Vol. 1182, pp. 178-179
Recent developments in efforts to determine the cause of anomalous experimental nuclear decay fluctuations suggest a possible solar influence. Here we report on the preliminary results from several nuclear decay experiments performed at Thule Air Base in Greenland during the solar eclipse that took place on 1 August 2008. Because of the high northern latitude and time of year, the Sun never set, which provided relatively stabilized conditions for nearly all environmental factors. An exhaustive list of relevant factors was monitored during the eclipse to help rule out possible systematic effects due to external influences. In addition to the normal temperature, pressure, humidity, and cloud cover associated with the outside ambient observations, we included similar measurements within the laboratory along with monitoring of the power supply output, local neutron count rates, and the Earth's local magnetic and electric fields.
Investigation of Periodic Nuclear Decay Data with Spectral Analysis Techniques
American Institute of Physics Conference Proceedings, Vol. 1182, pp. 292-295
We provide the results from a spectral analysis of nuclear decay experiments displaying unexplained periodic fluctuations. The analyzed data were from 56Mn decay reported by the Children's Nutrition Research Center in Houston, 32Si decay reported by an experiment performed at the Brookhaven National Laboratory, and 226Ra decay reported by an experiment performed at the Physikalisch-Technische Bundesanstalt in Germany. All three data sets possess the same primary frequency mode consisting of an annual period. Additionally, a spectral comparison with the local ambient temperature, atmospheric pressure, relative humidity, Earth-Sun distance, and the plasma speed and latitude of the heliospheric current sheet (HCS) was performed. Following analysis of these six possible causal factors, their reciprocals, and their linear combinations, a possible link between nuclear decay rate fluctuations and the linear combination of the HCS latitude and 1/R motivates searching for a possible mechanism with such properties.
Effect of Thrust Profile on Velocity Pointing Errors of Spinning Spacecraft
Advances in the Astronautical Sciences, Vol. 116, pp. 219-230
During axial thrusting maneuvers, spacecraft and rockets are often spin-stabilized to ameliorate the effect of undesired transverse torques from thruster offset and misalignment. The velocity-pointing errors due to these undesired torques are inversely proportional to the square of the spin rate. Recent work shows that the spin-stabilized axial thrust maneuver can be improved considerably by softening the ignition transient (for example, by increasing the thrust gradually from zero to maximum, rather than having a nearly instantaneous jump to maximum thrust). In previous work it was found that a linear-ramp thrust profile (for ignition and for burnout) provides a significant reduction in velocity-pointing errors. One advantage of this type of thrust profile is that it permits much smaller spin rates, which reduces the propellant mass required for spin up and spin down. We show that deviations from the linear thrust profile, such as a sinusoidal or an exponential profile, do not have significant effects on the inherent advantage of softening the ignition transient. However, increasing the duration of the thrust transient (in the profiles we examine) provides the greatest reduction in velocity-pointing errors.
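The benefit of softening the ignition transient can be illustrated with a simplified kinematic model (my sketch, not the paper's analysis): a fixed thruster misalignment on a spinner produces a body-fixed transverse acceleration that rotates at the spin rate in inertial space, so the residual transverse delta-v is proportional to the Fourier component of the thrust profile at the spin frequency. The burn time, spin rate, and ramp time below are arbitrary illustrative values.

```python
import numpy as np

def pointing_error(profile, t, omega):
    """Transverse-to-axial delta-v ratio for thrust profile f(t) on a body
    spinning at omega: the transverse residue is the magnitude of the
    profile's Fourier component at the spin frequency."""
    f = profile(t)
    dt = t[1] - t[0]
    axial = f.sum() * dt
    transverse = abs((f * np.exp(1j * omega * t)).sum() * dt)
    return transverse / axial

T = 60.5                             # burn duration, s (arbitrary)
t = np.linspace(0.0, T, 400_000)
omega = 2 * np.pi * 1.0              # 1 rev/s spin rate (arbitrary)

step = lambda t: np.ones_like(t)                             # instant ignition
ramp = lambda t: np.clip(np.minimum(t, T - t) / 5.2, 0, 1)   # 5.2 s linear ramps

err_step = pointing_error(step, t, omega)
err_ramp = pointing_error(ramp, t, omega)
print(f"step ignition:   {err_step:.2e}")
print(f"ramped ignition: {err_ramp:.2e}")
```

In this toy model the ramped profile reduces the pointing-error ratio by more than an order of magnitude relative to the step profile, which is the qualitative effect the abstract describes; the full rigid-body treatment in the paper accounts for the precession dynamics this sketch omits.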
Extension of Satellite Lifetime via Precision Pointing of Orbit Transfer Maneuvers
Advances in the Astronautical Sciences, Vol. 116, pp. 205-218
We describe an extremely precise, open-loop control of velocity pointing for spin-stabilized rockets and spacecraft. This technique (Velocity Precision-pointing Enhancement System) employs coupling between the spinning spacecraft dynamics and the propulsion system characteristics to virtually eliminate velocity-pointing error. By modifying an engine to have a softer ignition transient, a reduction of nearly two orders of magnitude in velocity-pointing error can be obtained. This reduction of the pointing error can be directly translated into a savings of station-keeping propellant. Since less propellant is needed to correct the error, more is available to keep the spacecraft in orbit. In this paper we assess the mass savings achievable and calculate the potential extensions of satellite lifetimes.
Accelerator Search for Cosmic SIMPs
Nuclear Physics B - Proceedings Supplements, Vol. 124, pp. 205-208
We give limits on the contribution to the cosmic dark matter density of neutral, stable, strongly interacting massive particles (SIMPs). The limits are inferred from an accelerator mass spectrometry (AMS) experiment at the Purdue Rare Isotope Measurement Laboratory (PRIME Lab). The experiment accelerated nuclei of both gold and iron. The “SIMP signal” would be discovery of nuclei of these elements with anomalous masses. Since no such signal was observed for SIMP masses up to over 1 TeV, cosmic SIMP density limits may be given. Determining the minimum time of exposure to SIMPs trapped in the galaxy is a crucial element in the analysis of each sample.
Experimental Constraints on Strangelets and Other Exotic Nuclear Matter
Physical Review D, Vol. 67, pp. 034015
A reanalysis of data from a recent search for ultraheavy isotopes of gold and iron leads to new constraints on several classes of exotic objects. These include strangelets, MEMOs (metastable exotic multihypernuclear objects), and CHAMPs (charged massive particles) which may have been present in the data, but which could have nonetheless gone undetected due to the design of the original experiment. As a result of the new analysis we are able to greatly enlarge the exclusion regions for exotic particles of mass M and charge Z, and provide limits as low as 10^-11 for small M/Z, and 10^-7 for M/Z up to 120.
Astrophysical Detection of Heavy-Particle-Induced Spectral Shifts in Muonic Iron
Physical Review D, Vol. 66, pp. 123508
By significantly increasing the nuclear mass, a strongly interacting massive particle (SIMP) bound to an iron nucleus would cause a characteristic change in the spectrum of muonic iron. At temperatures high enough that such atoms are completely stripped of electrons, the effect is directly observable as a 0.2% shift in the energies of high angular momentum states. This phenomenon provides a new test for the existence of SIMPs, which have been proposed as dark matter candidates, and as candidates for the lightest supersymmetric particle.
Testing the Atomic Structure of Beryllium with AMS
Nuclear Instruments and Methods in Physics Research Section B: Beam Interactions with Materials and Atoms, Vol. 194, pp. 78-89
If the Pauli exclusion principle were violated, the electronic structure of Be could be 1s^4 (denoted by Be′) rather than 1s^2 2s^2. This paper describes the results of an experimental search for Be′, carried out at PRIME Lab, the Purdue Rare Isotope Measurement Laboratory. In the process of setting stringent constraints on Be′ using samples of metallic Be, Be ore, natural gas, and air, we made several modifications to the PRIME Lab facility. These included a new Be-free source and the construction of a gas introduction system which coupled to the existing ion source. The modifications permitted us to reach limits which are a factor of nearly 300 better than those obtained in previous experiments.
Search for Anomalously Heavy Nuclei in Gold and Iron
Physical Review D, Vol. 65, pp. 072003
There are a number of theoretical motivations for searching for anomalously heavy isotopes ZX of known elements, where Z is the nuclear charge of the anomalous nucleus X. Such nuclei could arise from the binding of a new strongly interacting massive particle (SIMP) to the nucleus of a known element, and could thus be detected as an anomalously heavy isotope of that element. SIMPs have been proposed as candidates for dark matter, and for the lightest supersymmetric particle, as well as a possible explanation for ultra high-energy cosmic rays. A search for anomalous nuclei X has been performed by analyzing several unique samples including gold nuggets collected in Australia, Arizona and North Carolina, gold foils flown on NASA’s LDEF satellite, and an Fe meteorite. In each gold sample we scanned for Au isotopes with masses up to 1.67 TeV/c^2 using PRIME Lab, the Purdue accelerator mass spectrometer facility. We have also searched for anomalous Fe isotopes with masses up to 0.65 TeV/c^2 in the iron meteorite sample. We find no evidence for SIMPs in any of our samples, and our results set stringent limits on the abundance of anomalous isotopes of ordinary matter as a function of X mass.
New Experimental Bounds on the Contributions to the Cosmological Density Parameter Ω from Strongly Interacting Massive Particles
The Astrophysical Journal, Vol. 568, pp. 1-8
Strongly interacting neutral massive particles (SIMPs) have been proposed as candidates for dark matter, as the lightest supersymmetric particle, as a possible explanation for ultra-high-energy cosmic rays, and as a dark matter solution to galactic structure problems. If bound to nuclei, SIMPs could manifest themselves as anomalously heavy isotopes of known elements. We analyze the results from a recent experimental search for SIMPs in a collection of gold and iron samples with various exposures to cosmic rays and to a SIMP component of dark matter. The samples included gold flown on the NASA Long-Duration Exposure Facility, as well as geological samples and an iron meteorite. We show that the bounds on SIMPs from that experiment can be used to set nontrivial constraints on the SIMP contribution to the cosmological density parameter Ω.
New Experimental Limits on Strongly Interacting Massive Particles at the TeV Scale
Physical Review Letters, Vol. 87, pp. 23180
We have carried out a search for strongly interacting massive particles (SIMPs) bound to Au and Fe nuclei, which could manifest themselves as anomalously heavy isotopes of these elements. Our samples included gold from the NASA Long Duration Exposure Facility satellite, RHIC at Brookhaven National Laboratory, and from various geological sources. We find no evidence for SIMPs in any of our samples, and our results set stringent limits (as low as ∼10^-12) on the abundances of anomalous Au or Fe isotopes with masses up to 1.67 and 0.65 TeV/c^2, respectively.
Experimental Limits on the Existence of Strongly Interacting Massive Particles Bound to Gold Nuclei
Physical Review D, Vol. 64, pp. 012005
We report the results from an experimental search for strongly interacting massive particles bound to gold nuclei. A scan for heavy gold isotopes with masses ranging from 186.3 to 325.9 GeV/c^2 was performed on laboratory gold and gold from western Australia using PRIME Lab, the Purdue Accelerator Mass Spectrometer facility. The results provide significant new constraints on current models which predict the existence of such particles with abundance ratios in the range 10^-11–10^-10.
Exotic Particle Searches using the Purdue AMS Facility
American Institute of Physics Conference Proceedings, Vol. 576, pp. 382-385
Two exotic particle searches are being performed using the Accelerator Mass Spectrometer (AMS) at the Purdue Rare Isotope Measurement Laboratory (PRIME Lab). Recent theoretical developments allow for the possibility of small violations of the symmetrization postulate, which may lead in turn to detectable violations of the Pauli exclusion principle. We report the results of a new experimental search for paronic (Pauli-violating) Be, denoted by Be', in samples where Be' retention would be highest. Our limits represent an improvement by a factor of approximately 300 over a previous search for Be'. There are also several recent cosmological motivations for strongly interacting massive particles (SIMPs). We present results from our current search for anomalous heavy isotopes of Au in samples of Australian and laboratory gold with a limit on SIMP abundance ratios as low as 10^-12. This experiment provides significant constraints on the existence of such particles in high-Z nuclei.
New Experimental Limits on the Existence of Strongly Interacting Dark Matter Particles
XXXVIth Rencontres de Moriond Conference Proceedings
We report on the results from an experimental search for strongly interacting massive dark matter particles bound to gold nuclei. A scan for heavy gold isotopes with masses ranging from 186.3–325.9 GeV/c^2 was performed on laboratory gold and gold from western Australia using PRIME Lab, the Purdue Accelerator Mass Spectrometer facility. If such particles exist in this mass range, their abundances must be less than 10^-11–10^-10. These results provide significant new constraints on current models which predict the existence of such particles.
New Experimental Test of the Pauli Exclusion Principle Using Accelerator Mass Spectrometry
Physical Review Letters, Vol. 85, pp. 2701-2704
We report the results of a new experimental search for the Pauli-forbidden 1s^4 state of Be, denoted by Be′. Using the Accelerator Mass Spectrometer facility at Purdue University, we set limits on the abundance of Be′ in metallic Be, Be ore, natural gas, and air. Our results improve on those obtained in a previous search for Be′ by a factor of approximately 300.
Velocity Pointing Errors Associated with Spinning Thrusting Spacecraft
Journal of Spacecraft and Rockets, Vol. 37, pp. 359-365
Because of imperfections in spacecraft assembly, misalignment and offset torques always exist during thrust maneuvers. In the case of an axially thrusting spin-stabilized spacecraft, these torques disturb the angular momentum vector in inertial space, causing a velocity pointing error. Much insight can be gained by analytically solving the problem of time-varying torques and time-varying moments of inertia. We use approximate analytic solutions to suggest how the velocity pointing error can be reduced under some practical assumptions based on current technology. For example, in the case of solid rocket motors, a significant improvement in velocity pointing can be realized by judicious distribution of the propellant.
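The mechanism described in this abstract, a disturbance torque during the ignition transient exciting wobble of the spin axis and thereby tilting the accumulated velocity, can be caricatured with a one-degree-of-freedom toy model. This sketch is my own illustration, not the paper's formulation; the nutation frequency, burn time, and oscillator model are all assumptions. It shows only the qualitative point that a softened ignition ramp (many nutation periods long) leaves a smaller burn-averaged tilt than a near-step one:

```python
import numpy as np

# Toy model: treat the spin-axis tilt as an undamped oscillator at the
# nutation frequency, forced by the normalized thrust profile.  A step
# ignition rings at twice the static deflection; a ramp spanning many
# nutation periods stays quasi-static, so the burn-averaged tilt (a proxy
# for the velocity pointing error) is smaller.
LAM = 2.0 * np.pi        # nutation frequency, rad/s (assumed value)
BURN = 30.0              # burn time, s (assumed value)
t = np.linspace(0.0, BURN, 60001)
dt = t[1] - t[0]

def mean_tilt(ramp_time):
    """Burn-averaged |tilt| for thrust ramping up over `ramp_time` seconds."""
    forcing = np.clip(t / max(ramp_time, dt), 0.0, 1.0)  # normalized thrust
    tilt, rate, accum = 0.0, 0.0, 0.0
    for f in forcing:                        # semi-implicit Euler integration
        rate += LAM ** 2 * (f - tilt) * dt   # tilt'' = lam^2 * (f - tilt)
        tilt += rate * dt
        accum += abs(tilt) * dt
    return accum / BURN

sharp = mean_tilt(0.0)     # near-instantaneous ignition transient
soft = mean_tilt(10.0)     # softened ignition, ~10 nutation periods
assert soft < sharp        # softer ramp -> smaller averaged tilt
```

The same logic motivates the ignition-softening patent listed below under Patents: the gentler the thrust ramp relative to the nutation period, the less transient ringing, and the smaller the resulting velocity pointing error.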
Testing the Pauli Exclusion Principle With Accelerator Mass Spectrometry
American Institute of Physics Conference Proceedings, Vol. 545, pp. 288-294
We report the results of a new experimental search for the Pauli-forbidden 1s^4 state of Be, denoted by Be′. Using the Accelerator Mass Spectrometer facility at Purdue, we set limits on the abundance of Be′ in metallic Be, Be ore, natural gas, and air (10^-14). Our results improve on those obtained in a previous search for Be′ by a factor of approximately 300.
Patents
-
Method for Velocity Precision Pointing in Spin-Stabilized Spacecraft or Rockets
Issued US 6,332,592 B1
A method of velocity precision pointing of spin-stabilized spacecraft or rockets is disclosed. The method involves softening the ignition transient of the ramp-up phase of thrust, which may be achieved by modifying the solid propellant; applying a coating of slow-burning material to the solid propellant; varying the grain density of the solid propellant over an exposed surface area or as a function of propellant depth, where depth is defined in the direction of burn surface regression; pre-pressurizing the combustion chamber with a gas that has a molecular mass smaller than that of the combustion products and is unreactive with the propellant; installing one or more relief valves in the combustion chamber to regulate pressure; forming a ring of ablative material on the throat section of the nozzle; fitting the nozzle throat with a rubber insert to regulate combustion-chamber pressure; pulsing the engine; or using a pyrogen igniter to soften the ignition transient.
More activity by Dan "Animal"
-
TONIGHT: Artificial Intelligence in Military Aviation at 7pm MST It’s time to go…Behind the Wings! Link in comments to Episode 1, produced by Wings…
Liked by Dan "Animal" Javorsek, PhD
-
From Senior Space Force Officer Jason Lowery I don't think anyone grasps how transformational a technology Bitcoin is
Liked by Dan "Animal" Javorsek, PhD
-
Aviation Week Network received a rare glimpse inside the U.S. Navy's plans for the secretive F/A-XX program, the future replacement for the F/A-18E/F…
Liked by Dan "Animal" Javorsek, PhD
-
WORLD PREMIERE: Join PBS and Wings Over the Rockies Air & Space Museum as they give an inside look at the United States Air Force Test Pilot…
Liked by Dan "Animal" Javorsek, PhD
-
#Bitcoin is a national strategic priority. Great points from Jason Lowery. LFG!
Liked by Dan "Animal" Javorsek, PhD
-
✨The U.S. Department of Energy (DOE) took a major step in supporting #AI advancement with Oak Ridge National Laboratory leading the way. DOE…
Liked by Dan "Animal" Javorsek, PhD
-
How an F-16 fighter pilot feels after flying with the newest version of Northrop Grumman's Integrated Viper Electronic Warfare Suite (IVEWS). See why…
Liked by Dan "Animal" Javorsek, PhD
-
WHEN, not if: #DoD will likely enforce that all developers use memory safe languages for #Defense software. Expect a mass migration from C/C++ to…
Liked by Dan "Animal" Javorsek, PhD
-
Eight weeks ago I was in the United States, listening to JD Vance, Elon Musk, and many other amazing people at the All-In Summit. Soaking up the…
Liked by Dan "Animal" Javorsek, PhD
-
Unity. Because at the end of the day, we're all on the same side... #aviation Us Navy Blue Angels United States Air Force #aviationindustry…
Liked by Dan "Animal" Javorsek, PhD
-
AEVEX Aerospace is #hiring for a multitude of jobs in Engineering, Growth, Programs, and Contracts across the United States. We are always looking…
Liked by Dan "Animal" Javorsek, PhD
-
"A Methodology for Thermal Limit Bias Predictability through Artificial…
Liked by Dan "Animal" Javorsek, PhD