National Primary and Secondary School Science Fair

United States

The Characterization of Human Epidermal Stem Cells

The role of Notch signaling in the regulation of growth and differentiation of epithelial stem cells is poorly understood. While specific markers for epidermal stem cells have not yet been identified, members of the Notch signaling pathway have been reported to be differentially expressed in the human epidermis. This study sought to demonstrate the presence and distribution of Notch and its ligands, Delta and Jagged, in human keratinocytes, and thereby characterize this subpopulation. Isolated epidermal cells were obtained from human neonatal foreskin samples. Cells shown to be negative for connexin43, a gap junction protein, and positive for keratin14, a basal marker, were classified as presumptive stem cells (PSC). Flow cytometry analysis showed this sorted subpopulation to be small and agranular. After two weeks in cell culture, PSC revealed a proliferative potential three times greater than that of non-sorted cells. The PSC also exhibited higher expression of the Delta and Jagged ligands than the general population. Additionally, RT-PCR confirmed the presence of Jagged and Delta in keratinocytes; however, only Jagged was detected by immunohistochemistry. Members of the Notch family were thus identified in the epithelium at both the protein and mRNA levels. The data suggest that variations in the expression of members of the Notch signaling pathway could potentially serve as markers for stem cells of the epithelium; however, further research is necessary to draw definitive conclusions, which would provide better insight into Notch regulatory pathways. This understanding could one day enable treatment of epithelial damage caused by various skin diseases, injuries, or burns.

Antimicrobial and Heavy Metal Sequestration Capacities of Graphene Polymer Nanofilms

Membrane bioreactors (MBR) are important components in the production of effluent in wastewater treatment systems. However, MBR are susceptible to biofouling, a process by which bacteria colonize the membrane surface in contact with water. Graphene could be a solution to biofilm formation. In this study, the antimicrobial and heavy metal removal properties of graphene polymer nanocomposites, and the mechanisms behind these properties, were investigated. Five different nanocomposite films, each combining a form of graphene with a polymer, were synthesized: graphene, graphene oxide (GO), PVK-GO, PVK-G, and PVK. A Büchner funnel and a vacuum pump were used to coat membrane filters with solutions of each nanomaterial. Using the Büchner funnel, E. coli and B. subtilis bacteria were filtered through each coated filter, and both the filtrate and the filter were examined for bacterial content. Similarly, a Pb2+ solution was filtered through the coated filters, and the percentage removal of the ion was calculated using Atomic Absorption Spectrometry. Further analysis of SEM, ATR-IR, and oxidative stress test data revealed that the PVK-GO nanocomposite inactivates bacteria by inducing oxidative stress and that its carboxyl groups bind lead ions. PVK-GO removed the highest percentage of heavy metal and displayed the strongest antimicrobial activity, inactivating the most bacteria. PVK-GO coatings provide an efficient and economical alternative to the current wastewater industry standard, with the potential to save millions of dollars and reduce environmental waste. The coatings also have applications in indwelling medical devices, where they can reduce the risks associated with biofilm formation and bacterial infections.

Carbon Nanostructures via Dry Ice Exposed to High Temperature

This science project is designed to answer whether a chemical reaction is needed to produce industrial quantities of carbon nanostructures by exposing dry ice to a temperature of at least 3100°C. A small carbon arc furnace powered by an electric welder is used to produce this temperature. During control runs, the carbon arc furnace is energized for a predetermined time, after which it is de-energized and any carbon particles within the furnace are collected. During carbon nanostructure synthesis runs, dry ice is placed within the carbon arc furnace; the furnace is energized and the dry ice is consumed over the same predetermined time. Carbon nanostructures synthesized during these runs are collected once the furnace is de-energized and allowed to cool. The volume of carbon particles collected during the control runs is compared to the volume of carbon nanostructures produced by the synthesis runs. This project found that, on average, at least 16 times more carbon nanostructures are produced during synthesis runs consuming dry ice than during control runs. Moreover, the synthesis runs did not rely on chemical reactions. Further, samples of the synthesized carbon nanostructures were imaged using a transmission electron microscope (TEM). The TEM images clearly show high-quality carbon nanostructures, including carbon nanotubes, faceted carbon nanospheres, and the super-material graphene.

A Novel Spectroscopic-Chemical Sensor Using Photonic Crystals

Detection of harmful chemicals used in industrial complexes is crucial to creating a safer environment for workers. Presently, most chemical detectors used in workplaces are expensive, inefficient, and cumbersome. To address these deficiencies, a novel sensor was fabricated to produce a unique spectroscopic fingerprint for various toxic chemicals. The sensor was fabricated by depositing several layers of silica spheres (diameter ~250 nm) on a glass substrate using evaporation-based self-assembly. As the spheres assemble into a photonic crystal, they also create void (i.e., air) spaces between them. A spectrometer was then used to monitor the reflectivity of the assembled crystal. The spectrum showed high reflectivity at a specific wavelength, governed by the average index of refraction of the spheres and the void spaces. As a foreign chemical infiltrates the photonic crystal, it occupies the void space, increasing the average index of refraction of the structure. Consequently, the peak wavelength of the reflectivity spectrum red-shifts, confirming the presence of a foreign substance. While the as-grown photonic crystal is able to detect chemicals, it cannot differentiate between chemicals with similar indices of refraction, such as ethanol and methanol. To detect such chemicals, five pieces of a single photonic crystal (i.e., a five-pixel device) were exposed to different silanes, which changed the surface chemistry of the silica spheres in each piece. In turn, the five-pixel device was able to produce a unique chemical fingerprint for several chemicals, which can be calibrated to detect toxins in the workplace.
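The red-shift described above follows the Bragg-Snell relation for the (111) planes of a close-packed opal. A minimal sketch, assuming ~250 nm silica spheres (n ≈ 1.46), the close-packed filling fraction of 0.74, and ethanol (n ≈ 1.36) as a hypothetical infiltrating chemical:

```python
import math

def effective_index(n_sphere, n_void, fill=0.74):
    """Volume-averaged effective refractive index of the opal;
    0.74 is the filling fraction of close-packed spheres."""
    return math.sqrt(fill * n_sphere**2 + (1 - fill) * n_void**2)

def bragg_peak_nm(diameter_nm, n_eff, theta_deg=0.0):
    """Bragg-Snell peak wavelength for the (111) planes, with
    interplanar spacing d_111 = sqrt(2/3) * sphere diameter."""
    d111 = math.sqrt(2.0 / 3.0) * diameter_nm
    sin2 = math.sin(math.radians(theta_deg)) ** 2
    return 2.0 * d111 * math.sqrt(n_eff**2 - sin2)

peak_air = bragg_peak_nm(250, effective_index(1.46, 1.00))  # voids hold air
peak_eth = bragg_peak_nm(250, effective_index(1.46, 1.36))  # ethanol infiltrated
print(peak_air, peak_eth)  # infiltration red-shifts the reflectivity peak
```

Because ethanol (n ≈ 1.36) and methanol (n ≈ 1.33) shift the peak by nearly the same amount, index alone cannot separate them, which is why the surface-functionalized five-pixel device is needed.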

Parallax Modelling of OGLE Microlensing Events

We present a study using microlensing event data from the Optical Gravitational Lensing Experiment (OGLE), recorded from the Galactic bulge in the period 2002-2016. Our two algorithms are based on the standard point-source-point-lens (PSPL) model and on the less conventional parallax model, respectively. The optimal fit was found for each sample event with a chi-square optimization algorithm, along with the best-fit parameters. Out of the 7 best fits, 4 show a strong parallax effect. The microlensing fit parameters were then cross-matched with proper motion data from the Naval Observatory Merged Astrometric Dataset (NOMAD) to obtain lens mass estimates for four events: 0.447, 0.269, 0.269, and 17.075 solar masses, respectively. All masses were within the microlensing mass interval for lenses found in similar studies. We conclude that the parallax model often describes long events better, demonstrating the importance of utilizing both PSPL fits and parallax fits instead of the PSPL model alone. By varying only 2 of the 7 parallax microlensing parameters instead of all simultaneously, we obtain plausible values for lens direction and lens transverse velocity: a method to investigate microlensing lens properties regardless of lens luminosity. In addition, we present spectral classes of the NOMAD objects associated with each event, which is vital for future investigations to further confirm the mass estimates. We present strategies to further enhance the algorithm analyzing microlensing light curves to better find deviations. We also conclude that our double model can potentially unveil the presence of dim lens objects (MACHOs) such as brown dwarfs, exoplanets, or black holes.
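The PSPL light curve underlying the first fit has a standard closed form. A minimal sketch with hypothetical parameter values (t0 = time of peak, u0 = impact parameter, tE = Einstein crossing time; the chi-square is the quantity the optimization algorithm minimizes):

```python
import numpy as np

def pspl_magnification(t, t0, u0, tE):
    """Standard PSPL magnification A(u), with u(t) the lens-source
    separation in Einstein radii."""
    u = np.sqrt(u0**2 + ((t - t0) / tE) ** 2)
    return (u**2 + 2) / (u * np.sqrt(u**2 + 4))

def chi_square(t, flux, flux_err, t0, u0, tE, f_base=1.0):
    """Goodness of fit: minimized over (t0, u0, tE) in the fitting step."""
    model = f_base * pspl_magnification(t, t0, u0, tE)
    return np.sum(((flux - model) / flux_err) ** 2)

# Hypothetical event: peak at day 50, impact parameter 0.3, tE = 20 d
t = np.linspace(0, 100, 200)
flux = pspl_magnification(t, 50.0, 0.3, 20.0)
peak = flux.max()
print(peak)  # ~3.44, the classic A(u0) for u0 = 0.3
```

The parallax model replaces the straight-line u(t) above with a trajectory that includes the Earth's orbital motion, adding the extra parameters the study varies.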

Geographic Belts for Hurricane Landfall Location Prediction

When predicting a hurricane’s landfall location, small improvements in accuracy result in large savings of lives, property, and money. The project’s purpose was to apply a breakthrough method that predicts the geographic location of a hurricane’s landfall with high accuracy. Researchers have long known that there are strong correlations between a hurricane’s landfall location and the geographic regions its track passes through. However, no methods have been developed to describe these correlations mathematically and explicitly. Consequently, the correlations serve meteorologists only as vague guidelines for their guesstimates and are not usable in practical forecasts. By studying the correlations and performing numerical optimization on historical hurricane data, this research discovered a set of geographic belt regions in the Gulf of Mexico that can be used as landfall location predictors. When a hurricane passes through any one of these belts, a prediction can be made by extending the hurricane’s moving direction vector toward land: the intersection of this extension line with the coastline is the predicted landfall location. This prediction method is simple and straightforward, using only basic measurements from meteorological satellites: the hurricane’s real-time locations and moving directions. In conclusion, when compared to existing methods, the predictive belt method (PBM) created in this research provides a landfall location forecast with higher accuracy. Verification with historical hurricane data demonstrated that the PBM’s average error is less than 50% of the National Hurricane Center models’ error.
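The geometric step, extending the moving direction vector until it meets the coastline, can be sketched as a ray-polyline intersection. The coordinates and coastline below are toy planar values, not the study's Gulf of Mexico belts; real use would need geodetic coordinates:

```python
def landfall_prediction(position, direction, coastline):
    """Extend the hurricane's direction vector from its current position and
    return its first intersection with a coastline given as a polyline of
    (x, y) points, or None if the ray never reaches the coast."""
    px, py = position
    dx, dy = direction
    best = None
    for (x1, y1), (x2, y2) in zip(coastline, coastline[1:]):
        sx, sy = x2 - x1, y2 - y1
        denom = dx * sy - dy * sx
        if abs(denom) < 1e-12:
            continue  # ray parallel to this coastline segment
        # Solve position + t*direction == segment start + s*(segment vector)
        t = ((x1 - px) * sy - (y1 - py) * sx) / denom
        s = ((x1 - px) * dy - (y1 - py) * dx) / denom
        if t > 0 and 0 <= s <= 1 and (best is None or t < best[0]):
            best = (t, (px + t * dx, py + t * dy))
    return None if best is None else best[1]

coast = [(0, 5), (10, 5)]  # toy straight coastline
print(landfall_prediction((2, 0), (1, 1), coast))  # (7.0, 5.0)
```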

Automated Illustration of Text to Improve Semantic Comprehension

Millions of people worldwide suffer from aphasia, a disorder that severely inhibits language comprehension. Medical professionals suggest that individuals with aphasia have a noticeably greater understanding of pictures than of the written or spoken word. Accordingly, we design a text-to-image converter that augments lingual communication, overcoming the highly constrained input strings and predefined output templates of previous work. This project offers four primary contributions. First, we develop an image processing algorithm that finds a simple graphical representation for each noun in the input text by analyzing Hu moments of contours in images from The Noun Project and Bing Images. Next, we construct a dataset of 700 human-centric action verbs annotated with corresponding body positions. We train support vector machines to match verbs outside the dataset with appropriate body positions. Our system illustrates body positions and emotions with a generic human representation created using iOS’s Core Animation framework. Third, we design an algorithm that maps abstract nouns to concrete ones that can be illustrated easily. To accomplish this, we use spectral clustering to identify 175 abstract noun classes and annotate these classes with representative concrete nouns. Finally, our system parses two datasets of pre-segmented and pre-captioned real-world images (ImageClef and Microsoft COCO) to identify graphical patterns that accurately represent semantic relationships between the words in a sentence. Our tests on human subjects establish the system’s effectiveness in communicating text using images. Beyond people with aphasia, our system can assist individuals with Alzheimer’s or Parkinson’s, travelers located in foreign countries, and children learning how to read.
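The Hu-moment matching in the first contribution rests on image moments that are invariant to translation and scale. A numpy-only sketch of the first two invariants (a simplification for illustration; the full method uses all seven Hu moments computed on contours from The Noun Project and Bing Images):

```python
import numpy as np

def hu_signature(img):
    """First two Hu invariant moments of a binary image: translation- and
    scale-invariant, so a shape matches a resized copy of itself."""
    ys, xs = np.nonzero(img)
    m00 = float(len(xs))
    xbar, ybar = xs.mean(), ys.mean()
    def eta(p, q):  # scale-normalized central moment
        mu = np.sum((xs - xbar) ** p * (ys - ybar) ** q)
        return mu / m00 ** (1 + (p + q) / 2)
    h1 = eta(2, 0) + eta(0, 2)
    h2 = (eta(2, 0) - eta(0, 2)) ** 2 + 4 * eta(1, 1) ** 2
    return np.array([h1, h2])

def shape_distance(a, b):
    return float(np.linalg.norm(hu_signature(a) - hu_signature(b)))

square = np.zeros((32, 32)); square[8:24, 8:24] = 1
big_square = np.zeros((64, 64)); big_square[8:56, 8:56] = 1
bar = np.zeros((32, 32)); bar[14:18, 2:30] = 1
d_same = shape_distance(square, big_square)  # small: same shape, new scale
d_diff = shape_distance(square, bar)         # large: different shape
print(d_same, d_diff)
```

Ranking candidate images by this distance is one way such a matcher can pick the simplest representative contour for a noun.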

Lunar Tide Contribution to Thermosphere Weather

Internet search technology is a pervasively used utility that relies on techniques from the field of spectral graph theory. We present a novel spectral approach to an existing problem: the critical group of the line graph has been characterized for regular nonbipartite graphs, but the general regular bipartite case remains open. Because of the ineffectiveness of previous techniques in regular bipartite graphs, our approach provides a new perspective and aims to obtain the relationship between the spectra of the Laplacians of the graph G and its line graph. We obtain a theorem for the spectra of all regular bipartite graphs and demonstrate its effectiveness by completely characterizing the previously unknown critical group for a particular class of regular bipartite graphs, the incidence graphs of finite projective planes with square order. This critical group is found to be Z_2 ⊕ (Z_{2q+2})^{q^3−1} ⊕ (Z_{q^2+q+1})^{q^2+q−1}, where q is the order of the finite projective plane.
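One well-known instance of the Laplacian relationship in question: for any k-regular graph G with n vertices and m edges, the Laplacian spectrum of its line graph is the Laplacian spectrum of G together with the eigenvalue 2k repeated m − n times. A small numerical check on K_{3,3}, a 3-regular bipartite graph (an illustration of the spectral setup, not the projective-plane incidence graphs of the result):

```python
import itertools
import numpy as np

def laplacian(adj):
    return np.diag(adj.sum(axis=1)) - adj

def line_graph_adjacency(adj):
    """Vertices of the line graph are the edges of G; two are adjacent
    exactly when the underlying edges share an endpoint."""
    n = adj.shape[0]
    edges = [(i, j) for i in range(n) for j in range(i + 1, n) if adj[i, j]]
    m = len(edges)
    la = np.zeros((m, m))
    for a, b in itertools.combinations(range(m), 2):
        if set(edges[a]) & set(edges[b]):
            la[a, b] = la[b, a] = 1
    return la

# K_{3,3}: 3-regular bipartite, n = 6 vertices, m = 9 edges
k, n, m = 3, 6, 9
adj = np.zeros((n, n))
adj[:3, 3:] = 1
adj[3:, :3] = 1

spec_G = np.linalg.eigvalsh(laplacian(adj))
spec_L = np.sort(np.linalg.eigvalsh(laplacian(line_graph_adjacency(adj))))
# line-graph Laplacian spectrum = spectrum of G plus 2k repeated m - n times
expected = np.sort(np.concatenate([spec_G, np.full(m - n, 2 * k)]))
print(np.allclose(spec_L, expected))  # True
```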

Multiple Time-step Predictive Models for Hurricanes in the North Atlantic Basin Based on Machine Learning Algorithms

The cost of damage caused by hurricanes in 2017 is estimated at over 200 billion dollars. Quick and accurate prediction of the path of a hurricane and its strength would be very valuable in alleviating these losses. Machine learning based prediction models, in contrast to models based on physics, have been developed successfully in many problem domains. A machine learning system infers the modeling function from a training dataset. This project developed machine learning based prediction models to forecast the path and strength of hurricanes in the North Atlantic basin. Feature analysis was performed on the HURDAT2 dataset, which contains the paths and strengths of past hurricanes. Artificial Neural Networks (ANNs) and Generalized Linear Model (GLM) approaches such as Tikhonov regularization were investigated to develop nine hurricane prediction models. The prediction accuracy of these models was compared on a testing dataset disjoint from the training dataset. The coefficient of determination and the mean squared error were used as performance metrics. Post-processing metrics, such as the geodesic error in path prediction and the mean wind speed error, were also used to compare models. The TLS linear regression model performed best out of the nine models for one and two time steps, while the ANNs made more accurate predictions over longer periods. All models predicted location and strength with a coefficient of determination greater than 0.95 for up to two days. My models predicted hurricane paths in under a second, with accuracy comparable to that of current models.
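Tikhonov regularization in its simplest form is ridge regression, solvable in closed form. A minimal sketch on synthetic tracks (the feature layout and drift numbers are invented for illustration; the project's models draw their features from HURDAT2):

```python
import numpy as np

def ridge_fit(X, y, alpha=1.0):
    """Tikhonov-regularized least squares: w = (X^T X + alpha*I)^-1 X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ y)

# Synthetic storm tracks: 200 tracks of 10 six-hourly (lat, lon) fixes,
# drifting by N(0.3, 0.1) degrees per step (invented numbers).
rng = np.random.default_rng(0)
tracks = np.cumsum(rng.normal(0.3, 0.1, size=(200, 10, 2)), axis=1)

X = tracks[:, :3].reshape(200, 6)  # features: the last three fixes
y = tracks[:, 3]                   # target: the next fix (lat, lon)
w = ridge_fit(X, y, alpha=0.1)     # one weight column per output coordinate
mse = float(np.mean((X @ w - y) ** 2))
print(mse)  # small: the model learns to extrapolate the drift
```

Multi-step forecasts can chain such a model on its own outputs or train a separate model per horizon; the longer the horizon, the more the nonlinear ANNs tend to help.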

Satellite Modeling of Wildfire Susceptibility in California Using Artificial Neural Networking

Wildfires have become increasingly frequent and severe due to global climatic change, demanding improved methodologies for wildfire modeling. Traditionally, wildfire severities are assessed through post-event, in-situ measurements. However, developing a reliable wildfire susceptibility model has been difficult because models fail to account for the dynamic components of wildfires (e.g., strong winds). This study examined the feasibility of employing satellite observation technology in conjunction with artificial neural networking to devise a wildfire susceptibility modeling technique for two regions in California. The timeframes of investigation were July 16 to August 24, 2017, and June 25 to December 8, 2017, for the Detwiler and Salmon August Complex wildfires, respectively. NASA’s MODIS imagery was utilized to compute NDVI (Normalized Difference Vegetation Index), NDWI (Normalized Difference Water Index), land surface temperature, net evapotranspiration, and elevation values. Neural network and linear regression models were then fit between these variables and ∆NBR (differenced Normalized Burn Ratio), a measure of wildfire burn severity. The neural network model generated from the Detwiler wildfire region was subsequently applied to the Salmon August Complex wildfire. Results suggest that a significant degree of variability in ∆NBR can be attributed to variation in the tested environmental factors. Neural networking also proved significantly more accurate than linear regression. Furthermore, the neural network model trained on the Detwiler data predicted ∆NBR for the Salmon August Complex with high accuracy, suggesting that if fires share similar environmental conditions, one fire’s model can be applied to others without the need for localized training.
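The spectral indices named above are all normalized band differences: NDVI = (NIR − Red)/(NIR + Red), NBR = (NIR − SWIR)/(NIR + SWIR), and ∆NBR is the pre-fire NBR minus the post-fire NBR. A short sketch with invented reflectance values (band roles follow these standard definitions; actual values would come from the MODIS bands):

```python
def normalized_difference(a, b):
    """Generic normalized-difference index: (a - b) / (a + b)."""
    return (a - b) / (a + b)

# Invented surface reflectances (fractions of incident light)
nir, red, swir = 0.45, 0.10, 0.20
ndvi = normalized_difference(nir, red)        # vegetation greenness
nbr_pre = normalized_difference(nir, swir)    # pre-fire Normalized Burn Ratio
nbr_post = normalized_difference(0.15, 0.35)  # burned: NIR drops, SWIR rises
dnbr = nbr_pre - nbr_post                     # ΔNBR: burn severity
print(round(ndvi, 3), round(dnbr, 3))
```

The same helper computes NDWI by substituting the appropriate water-sensitive band pair; higher ∆NBR indicates more severe burning.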