Description
Cancer is among the leading causes of death in the United States. Cancers are characterized by genetic defects that result in uncontrolled cell growth. Chemotherapeutics are currently among the mainstream treatments administered to cancer patients, but they are less effective when administered in the later stages of metastasis and can cause unwanted side effects and broad toxicities. Current efforts have therefore explored gene therapy as an alternative strategy for correcting the genetic defects associated with cancer by administering genes that encode proteins which induce cell death. While viral vectors achieve high-level expression of the delivered transgene, the potential for insertional mutagenesis and activation of immune responses raises concern in clinical applications. Non-viral vectors, including cationic lipids and polymers, have been explored as potentially safer alternatives to viral delivery systems. These systems are advantageous for transgene delivery because of their ease of synthesis, scalability, and versatility, and in some cases their biodegradability and biocompatibility. However, low transgene expression efficacies and high cytotoxicities limit the practical use of these polymers. In this work, a small library of twenty-one cationic polymers was synthesized by ring-opening polymerization of diglycidyl ethers (epoxides) with polyamines. The polymers were screened in parallel, and the transfection efficacies of individual polymers were compared to those of polyethylenimine (PEI), a current standard for polymer-mediated transgene delivery. The screening identified seven lead polymers that demonstrated higher transgene expression efficacies than PEI in pancreatic and prostate cancer cell lines. A second, related effort involved the generation of polymer-antibody conjugates in order to target delivered plasmid DNA selectively to cancer cells.
Future work with the novel lead polymers and polymer-antibody conjugates developed in this research will involve an investigation into the delivery of transgenes encoding for apoptosis-inducing proteins both in vitro and in vivo.
Contributors: Vu, Lucas (Author) / Rege, Kaushal (Thesis advisor) / Nielsen, David (Committee member) / Sierks, Michael (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
Phase contrast magnetic resonance angiography (PCMRA) is a non-invasive imaging modality capable of producing quantitative vascular flow velocity information. The encoding of velocity information can significantly increase the acquisition and reconstruction durations associated with this technique. The purpose of this work is to provide mechanisms for reducing the scan time of a 3D phase contrast exam, so that hemodynamic velocity data may be acquired robustly and with high sensitivity. The methods developed in this work focus on reducing the scan duration and reconstruction computation of a neurovascular PCMRA exam. The reductions in scan duration are made through a combination of advances in imaging and velocity encoding methods. The imaging improvements are explored using rapid 3D imaging techniques such as spiral projection imaging (SPI), Fermat looped orthogonally encoded trajectories (FLORET), stack of spirals, and stack of cones trajectories. Scan durations are also shortened through the use and development of a novel parallel imaging technique called Pretty Easy Parallel Imaging (PEPI). Improvements in the computational efficiency of PEPI, and of MRI reconstruction in general, are made in the areas of sample density estimation and correction for 3D trajectories. A new method of velocity encoding is demonstrated to provide more efficient signal-to-noise ratio (SNR) gains than current state-of-the-art methods. The proposed velocity encoding achieves improved SNR through the use of high gradient moments, resolving phase aliasing through the use of measurement geometry and non-linear constraints.
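The velocity encoding described above extends standard phase-contrast mapping, in which velocity is recovered from the phase difference between two acquisitions with opposite flow-encoding gradients. A minimal sketch of that conventional two-point baseline (not the thesis's novel multi-point scheme; the function name and VENC convention are illustrative):

```python
import numpy as np

def velocity_from_phase(phi_plus, phi_minus, venc):
    """Conventional two-point phase-contrast velocity mapping.

    Velocity is proportional to the phase difference between two
    acquisitions with opposite flow-encoding gradient moments; venc
    is the encoding velocity (e.g., cm/s) that maps to a phase of pi.
    """
    # Wrap the phase difference to (-pi, pi] before scaling; phase
    # outside this range aliases, which is the problem the thesis's
    # non-linear constraints address.
    dphi = np.angle(np.exp(1j * (np.asarray(phi_plus) - np.asarray(phi_minus))))
    return dphi / np.pi * venc
```

Velocities beyond VENC wrap around, which is why a larger VENC trades SNR for unaliased range in this simple scheme.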
Contributors: Zwart, Nicholas R (Author) / Frakes, David H (Thesis advisor) / Pipe, James G (Thesis advisor) / Bennett, Kevin M (Committee member) / Debbins, Josef P (Committee member) / Towe, Bruce (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
This dissertation creates models of past potential vegetation in the Southern Levant during most of the Holocene, from the beginnings of farming through the rise of urbanized civilization (12 to 2.5 ka BP). This time scale encompasses the rise and collapse of the earliest agrarian civilizations in the region. The archaeological record suggests that increases in social complexity were linked to climatic episodes (e.g., favorable climatic conditions coincide with intervals of prosperity or marked social development such as the Neolithic Revolution ca. 11.5 ka BP, the Secondary Products Revolution ca. 6 ka BP, and the Middle Bronze Age ca. 4 ka BP). The opposite can be said of periods of climatic deterioration, when settled villages were abandoned as their inhabitants returned to nomadic or semi-nomadic lifestyles (e.g., the abandonment of the largest Neolithic farming towns after 8 ka BP and the collapse of Bronze Age towns and cities after 3.5 ka BP during the Late Bronze Age). This study develops chronologically refined models of past vegetation from 12 to 2.5 ka BP, at 500-year intervals, using GIS, remote sensing, and statistical modeling tools (MAXENT) that derive from species distribution modeling. Plants are sensitive to alterations in their environment and respond accordingly; because of this, they are valuable indicators of landscape change. An extensive database of historical and field-gathered observations was created. Using this database, as well as environmental variables that include temperature and precipitation surfaces for the whole study period (also at 500-year intervals), the potential vegetation of the region was modeled. Through this means, a continuous chronology of the potential vegetation of the Southern Levant was built. The produced paleo-vegetation models generally agree with the proxy records. They indicate a gradual decline of forests and expansion of steppe and desert throughout the Holocene, interrupted briefly during the Mid-Holocene (ca. 4 ka BP, Middle Bronze Age). They also suggest that during the Early Holocene, forest areas were extensive, spreading into the Northern Negev, and that the two remaining forested areas in the Northern and Southern Plateau Region in Jordan were connected during this time. The models also show general agreement with the major cultural developments, with forested areas either expanding or remaining stable during prosperous periods (e.g., the Pre-Pottery Neolithic and Middle Bronze Age), and significantly contracting during moments of instability (e.g., the Late Bronze Age).
Contributors: Soto-Berelov, Mariela (Author) / Fall, Patricia L. (Thesis advisor) / Myint, Soe (Committee member) / Turner, Billie L (Committee member) / Falconer, Steven (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
In semiconductor physics, many material properties and phenomena can be brought to light through certain changes in the materials. A tool for defining new material properties so as to highlight a particular phenomenon greatly increases the ability to understand that phenomenon. The generalized Monte Carlo tool allows the user to do just that by keeping every parameter used to define a material, within the non-parabolic band approximation, under the user's control. A material is defined by specifying its valleys, energies, valley effective masses and their directions; the types of scattering to be included can also be chosen. With the deployment of the generalized Monte Carlo tool onto www.nanoHUB.org, the tool is available to users around the world, making it a useful educational tool that can be incorporated into curricula. The tool is integrated with Rappture to provide user-friendly access: the user can freely define a material in an easy, systematic way without having to worry about the underlying code. The output results are automatically graphed, and since the code incorporates an analytic band-structure model, it is relatively fast. The versatility of the tool has been investigated and has produced results closely matching experimental values for some common materials. Through the Rappture interface, one can easily modify the current parameter sets to obtain even more accurate results.
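The non-parabolic band approximation the tool relies on relates energy and wavevector through the dispersion E(1 + αE) = ħ²k²/2m*. A small sketch of evaluating that relation, solved for E via the quadratic formula; the GaAs-like Γ-valley parameters in the test are assumptions for illustration only:

```python
import math

HBAR = 1.0545718e-34    # reduced Planck constant, J*s
M0   = 9.1093837e-31    # free-electron mass, kg
Q    = 1.602176634e-19  # elementary charge, J per eV

def nonparabolic_energy(k, m_eff, alpha):
    """Energy in eV for wavevector k (1/m) in the non-parabolic
    dispersion E(1 + alpha*E) = hbar^2 k^2 / (2 m*), with m_eff in
    units of the free-electron mass and alpha in 1/eV."""
    gamma = (HBAR * k) ** 2 / (2.0 * m_eff * M0) / Q  # parabolic energy, eV
    if alpha == 0.0:
        return gamma
    # Positive root of alpha*E^2 + E - gamma = 0.
    return (math.sqrt(1.0 + 4.0 * alpha * gamma) - 1.0) / (2.0 * alpha)
```

As α grows, the band flattens, so the non-parabolic energy at a given k falls below the parabolic value.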
Contributors: Hathwar, Raghuraj (Author) / Vasileska, Dragica (Thesis advisor) / Goodnick, Stephen M (Committee member) / Saraniti, Marco (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
The pace of technological development and the integral role technologies play in the lives of today's youth continue to transform perceptions and definitions of literacy. Just as the growth in completely online texts and the use of audio books and e-readers expands the definition of reading, digital platforms like blogs expand the notion of literary response and analysis. Responding to the complexities of literacy, this study examines the ways in which the literacy practice of blogging about young adult literature might elicit the active, intellectual orientation, or habits of mind, often sought in adolescent literacy development. Employing Gardner's Five Minds theory as an analysis tool and what Erickson calls "key linkages" as a framework, blog transcripts were read and coded. Those coded literacy acts were then linked to reveal any evidence of the creating, respectful, ethical, disciplined, and synthesizing habits of mind. From these overlays, empirical data tables emerged, accompanied by integrated case study narratives. Empirical data illustrate the aspects of the cases, and exposition provides a feature analysis of the habits of mind observed during blogging as a form of literary response to young adult literature. Results of this study suggest that bloggers writing about young adult books in a weblog environment reveal 1) some proficiency at synthesizing material, 2) a tendency to evaluate, 3) only moderate demonstration of the disciplined and respectful/ethical habits, 4) minimal evidence of the creating mind, and 5) moderate proficiency in basic transactional writing. Aligning with previous research, Talking with Our Fingertips illuminates possibilities for adopting pedagogical principles that provide student agency and potentially increase motivation and productivity.
Contributors: Miller, Donna L. (Donna Lynn) (Author) / Blasingame, James (Thesis advisor) / Chin, Beverly A (Committee member) / Marsh, Josephine P (Committee member) / Nilsen, Alleen P (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
This study examined the effect of consuming pinto, black, and dark red kidney beans with white rice in comparison to a white rice only control meal on the glycemic response of adults with type 2 diabetes (T2D). These bean and rice combinations are part of many traditional diets. Seventeen subjects with T2D treated by diet and/or metformin were randomly assigned to 4 treatments: white rice (control), pinto beans/rice, black beans/rice, and dark red kidney beans/rice. All treatments were portioned by weight and matched for available carbohydrate content of ∼ 50 grams. Capillary whole blood samples were collected at baseline and at 30, 60, 90, 120, 150 and 180 minutes posttreatment and assessed for glucose concentration using the YSI Stat Plus Analyzer. Net change glucose responses were significantly lower for the pinto, black, and dark red kidney bean and rice meals than control at 90, 120 and 150 minutes posttreatment (P < 0.05). Incremental area under the curve (iAUC) values were also significantly reduced for the bean/rice meals containing pinto (P < 0.01) and black beans (P < 0.05) in contrast to the rice control. Results suggest that the combination of whole beans and rice may be beneficial to those with T2D to assist with blood glucose management.
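The iAUC values reported above are conventionally computed by the trapezoidal rule over the rise in glucose above the fasting baseline, ignoring any area below baseline. A simplified sketch of that calculation (the study's exact geometric method may differ; the sample values in the test are hypothetical):

```python
def incremental_auc(times, glucose):
    """Incremental area under the curve above the fasting baseline
    (taken as the first sample), by the trapezoidal rule, with
    below-baseline excursions clipped to zero -- a simplified form
    of the standard iAUC used in glycemic-response studies."""
    baseline = glucose[0]
    net = [max(g - baseline, 0.0) for g in glucose]  # rise above baseline
    auc = 0.0
    for i in range(1, len(times)):
        auc += (net[i] + net[i - 1]) / 2.0 * (times[i] - times[i - 1])
    return auc
```

With glucose in mg/dL and time in minutes, the result is in mg/dL x min, the usual unit for comparing treatments.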
Contributors: Thompson, Sharon (Author) / Winham, Donna M (Thesis advisor) / Beezhold, Bonnie (Committee member) / Dixon, Kathleen (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
Over the past forty years the nonprofit sector has experienced a steady rise in the professionalization of its employees and its operations. Some have argued that this trend is in large part a reaction to the requirements foisted upon the nonprofit sector through the passage of the Tax Reform Act of 1969. While some scholars have detailed a number of unintended consequences that have resulted from this trend toward professionalization, in general scholars and practitioners have accepted it as a necessary step along the path toward ensuring that service is administered in an accountable and responsible manner. I analyze the contemporary trend in professionalization of the nonprofit sector from a different angle--one which seeks to determine how the nonprofit sector came to problematize the nature of its service beginning in the early twentieth century, as well as the consequences of doing so, rather than reinforce the existing normative arguments. To this end, I employ an "analytics of government" from an ethical and political perspective which is informed by Michel Foucault's conception of genealogy, as well as his work on governing rationalities, in order to reveal the historical and political forces that contribute to the nonprofit sector's professionalization and that shape its current processes, institutions, and norms. I ultimately argue that these forces serve to reinforce a broad movement away from the charitable impulse that motivates individuals to engage in personal acts of compassion and toward a philanthropic enterprise by which knowledge is rationally applied toward reforming society rather than aiding individuals. This movement toward institutional philanthropy and away from individual charity supplants the needs of the individual with the needs of the organization. 
I then apply this analysis to propose an alternate governing model for the nonprofit sector--one that draws on Foucault's exploration of ancient writings on love, self-knowledge, and governance--in order to locate a space for the individual in nonprofit life.
Contributors: Sandberg, Billie (Author) / Catlaw, Thomas J (Thesis advisor) / Denhardt, Janet V (Committee member) / Hall, John S. (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
As a term and method rapidly gaining popularity, Building Information Modeling (BIM) is under the scrutiny of many building professionals who question its potential benefits on their projects. A relevant and accepted calculation methodology and baseline for properly evaluating BIM's benefits have not been established, so there are mixed perspectives and opinions of the benefits of BIM, creating a general misunderstanding of the expected outcomes. The purpose of this thesis was to develop a more complete methodology for analyzing the benefits of BIM and to apply recent projects to this methodology to quantify outcomes, resulting in a more holistic framework of BIM and its impacts on project efficiency. From the literature, a framework calculation model for determining the value of BIM is developed and presented. The developed model is applied via case studies within a large industrial setting, where similar projects are evaluated, some implementing BIM and some following traditional non-BIM approaches. Cost (investment) metrics were considered along with benefit (return) metrics. The return metrics were requests for information, change orders, and duration improvements; the investment metrics were design and construction costs. The methodology was tested against three separate cases, and results on the returns and investments are presented. The findings indicate that in the tool installation department of semiconductor manufacturing, there is high potential for BIM benefits to be realized. The evidence also suggests that actual returns and investments will vary with each project.
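The abstract does not give the framework's exact formula, but its return/investment framing suggests rolling dollar-equivalent metrics up into a single return-on-investment figure. A purely hypothetical sketch of such a roll-up, with illustrative metric names not taken from the thesis:

```python
def bim_roi(returns, investments):
    """Hypothetical roll-up of BIM return and investment metrics.

    returns and investments map metric names to dollar-equivalent
    values (e.g., avoided change-order cost, added modeling cost);
    the result is a simple net return-on-investment ratio.
    """
    total_return = sum(returns.values())
    total_investment = sum(investments.values())
    if total_investment == 0:
        raise ValueError("investment metrics must be non-zero")
    return (total_return - total_investment) / total_investment
```

A positive ratio would indicate the benefit metrics outweigh the added design and construction costs on that project.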
Contributors: Barlish, Kristen Caroline (Author) / Sullivan, Kenneth T. (Thesis advisor) / Kashiwagi, Dean T. (Committee member) / Badger, William W. (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
An offender's expression of remorse plays an important role following relational transgressions, yet it is not well understood how the experience and expression of remorse relate to both victim responses to hurt and forgiveness in close relationships. This study uses a social functionalist framework to investigate the role of remorse in the forgiveness process and tests whether offenders' remorse experiences mediate the associations between victim responses to hurt and remorse expressions. Undergraduate participants (N = 671) completed questionnaires about a time when they hurt a close relational partner and reported their partners' responses to hurt, their own experiences and expressions of remorse, and their perceptions of forgiveness. Results indicated that victims' sad communication positively predicted offenders' other-oriented and affiliation remorse experiences; victims' threatening communication positively predicted offenders' self-focused remorse experience; and victims' conciliatory communication and withdrawal positively predicted offenders' affiliation and self-focused remorse experiences. Results of the mediation analyses revealed that self-focused remorse fully mediated the relationship between victim threatening communication and low-status behaviors; other-oriented remorse partially mediated the association between victim sad communication and apology/concern behaviors; and affiliation partially mediated the relationship between victim conciliatory communication and connection behaviors. Victims' withdrawal behaviors and offenders' use of compensation were not related. Finally, offenders' apology/concern and connection behaviors were positively associated with perceptions of forgiveness, whereas low-status behaviors negatively predicted forgiveness. Use of compensation following a hurtful event was not significantly related to forgiveness.
Results are interpreted within the framework of evolutionary psychology and further validate the functional approach to studying emotion.
Contributors: Gracyalny, Monica (Author) / Mongeau, Paul A. (Thesis advisor) / Guerrero, Laura K. (Committee member) / Shiota, Michelle N. (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
Redundant Binary (RBR) number representations have been extensively used in the past for high-throughput Digital Signal Processing (DSP) systems. Data-path components based on this number system have smaller critical path delay but larger area compared to conventional two's complement systems. This work explores the use of RBR number representation for implementing high-throughput DSP systems that are also energy-efficient. Data-path components such as adders and multipliers are evaluated with respect to critical path delay, energy and Energy-Delay Product (EDP). A new design for a RBR adder with very good EDP performance has been proposed. The corresponding RBR parallel adder has a much lower critical path delay and EDP compared to two's complement carry select and carry look-ahead adder implementations. Next, several RBR multiplier architectures are investigated and their performance compared to two's complement systems. These include two new multiplier architectures: a purely RBR multiplier where both the operands are in RBR form, and a hybrid multiplier where the multiplicand is in RBR form and the other operand is represented in conventional two's complement form. Both the RBR and hybrid designs are demonstrated to have better EDP performance compared to conventional two's complement multipliers. The hybrid multiplier is also shown to have a superior EDP performance compared to the RBR multiplier, with much lower implementation area. Analysis on the effect of bit-precision is also performed, and it is shown that the performance gain of RBR systems improves for higher bit precision. Next, in order to demonstrate the efficacy of the RBR representation at the system-level, the performance of RBR and hybrid implementations of some common DSP kernels such as Discrete Cosine Transform, edge detection using Sobel operator, complex multiplication, Lifting-based Discrete Wavelet Transform (9, 7) filter, and FIR filter, is compared with two's complement systems. 
It is shown that for relatively large computation modules, the RBR to two's complement conversion overhead gets amortized. For high-complexity systems at iso-throughput, both the hybrid and RBR implementations are demonstrated to be superior, with lower average energy consumption. For low-complexity systems, the conversion overhead is significant and outweighs the EDP performance gain obtained from the RBR computation operation.
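The conversion overhead discussed above comes from translating a redundant binary (signed-digit) word, with digits drawn from {-1, 0, 1}, back into two's complement, which amounts to one ordinary subtraction of the negative-digit bits from the positive-digit bits. A minimal sketch of an RBR word's value, assuming a least-significant-digit-first convention:

```python
def rbr_to_int(digits):
    """Value of a redundant binary (signed-digit) word with digit set
    {-1, 0, 1}, least-significant digit first: sum(d_i * 2^i).

    Conversion to two's complement is the subtraction pos - neg,
    which is the overhead the thesis amortizes over large modules.
    """
    pos = sum(1 << i for i, d in enumerate(digits) if d == 1)
    neg = sum(1 << i for i, d in enumerate(digits) if d == -1)
    return pos - neg
```

The redundancy, e.g., both [1, -1, 1] and [1, 1, 0] encoding 3, is what permits carry-free (constant-time) addition in RBR data paths, at the cost of two bits of storage per digit.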
Contributors: Mahadevan, Rupa (Author) / Chakrabarti, Chaitali (Thesis advisor) / Kiaei, Sayfe (Committee member) / Cao, Yu (Committee member) / Arizona State University (Publisher)
Created: 2011