Article

Evaluation of the Educational Impact of Participation Time in a Small Spacecraft Development Program

Jeremy Straub 1,* and David Whalen 2
1 Department of Computer Science, University of North Dakota, 3950 Campus Road, Stop 9015, Grand Forks, ND 58202, USA
2 Department of Space Studies, University of North Dakota, 4149 University Ave., Stop 9008, Grand Forks, ND 58202, USA
* Author to whom correspondence should be addressed.
Educ. Sci. 2014, 4(1), 141-154; https://doi.org/10.3390/educsci4010141
Submission received: 30 December 2013 / Revised: 17 February 2014 / Accepted: 26 February 2014 / Published: 18 March 2014

Abstract

The value of the duration of participation in a small spacecraft program has not previously been sufficiently characterized. This work seeks to determine whether most relevant benefits are received by participants quickly (suggesting that participant education would be best achieved by shorter-duration exposure to multiple domains) or accrue over time (suggesting that prolonged work on a single project would be most beneficial). The experiences of the student participants in the OpenOrbiter Small Spacecraft Development Initiative at the University of North Dakota are analyzed in an attempt to answer this question. To this end, the correlation between the duration of program participation and the level of benefit received (across five categories) is assessed herein.

1. Introduction

With the growing prevalence of small spacecraft development programs as an educational experience to teach students about multiple science, technology, engineering and math (STEM) disciplines, it is important to determine the ideal duration of participation to maximize student benefit. Practically, the question is whether benefit from participation accrues quickly at first (with the rate of additional gains then diminishing), whether the rate of gain grows over time or whether it remains constant. Alternately, it is possible that benefit is event- or experience-driven, meaning that there is no (or limited) correlation between the duration of participation and the level of benefit received; instead, benefit is received in discrete increments via participation in key milestones and events.
This paper begins to answer this question by assessing whether correlation exists between the duration of participation and the level of benefit received for each of five metrics (and one combined metric) for participants in the University of North Dakota’s OpenOrbiter Small Spacecraft Development Initiative (OOSDI). OOSDI is a CubeSat program that aims to demonstrate the efficacy of the Open Prototype for Educational NanoSats (OPEN) designs [1], which seek to enable the construction of a 1-U CubeSat-class spacecraft for under US $5,000 [2]. The first results from an ongoing longitudinal study that will assess the benefits received by participants over time are presented herein.

2. Background

The work presented herein builds on existing work on project-based learning and small spacecraft development. This section presents an overview of relevant prior work on project-based learning and small spacecraft development, as well as an overview of the OpenOrbiter program and prior work on its assessment.

2.1. Project-based Learning

Project-based learning (PBL), also known as problem-based learning, is a technique where students learn by doing. While the concept is by no means new (the apprenticeship style of learning has been used throughout history [3,4]), it is seen as a departure from the traditional lecture-based style of instruction. Some [5] see the benefits as great enough to affect national competitiveness on an international scale. The development of small spacecraft and CubeSats provides students with PBL-style educational benefits [6,7,8] in their discipline of participation. PBL has been shown to be effective in a diverse set of disciplines relevant to small spacecraft development, including aerospace [9], computer [10], electrical [11] and mechanical [12] engineering, computer science [13], project management [14] and entrepreneurship [15]. It has been shown to be effective across a wide range of educational and age levels [8,16]. It has also been shown to deliver benefits beyond driving learning about course topics. These include improved student self-image [17], creativity [18], motivation [17], material understanding [19], workforce preparation [19], job placement [20], academic program retention [21] and knowledge retention [22].
In the university context, PBL can occur in several formats. Students may engage in PBL activities as part of a regular course, such as through a course project [8] or a PBL-style course. They may participate as part of an independent or directed study [23] or to satisfy a senior design requirement [24]. They may also participate for extracurricular educational enrichment [23].

2.2. Small Spacecraft Development

Small spacecraft come in many varieties. In fact, the exact definition of a small spacecraft is elusive. Prefixes have been defined [25] to classify types of spacecraft; however, there is no line defining where small ends and larger sizes begin. Swartwout [26,27] proffers that size is not the defining attribute. Instead, he suggests that so-called “university-class” spacecraft should be defined by their educational missions, risk tolerance and ability to serve as a test bed for out-of-the-box concepts. The CubeSat is one form factor that is commonly used for university-class spacecraft. Developed initially by Bob Twiggs and Jordi Puig-Suari as a tool to facilitate aerospace engineering education [28], CubeSats are now widely used in education [29,30] and are also being developed for science [31,32,33,34], government [35], military [36,37] and commercial [38,39] purposes. Their development is being aided [40] by the availability of free-to-developer launch services from the U.S. Air Force [41], NASA [42] and the ESA [43]. Lower-cost commercial launches are also on the horizon [44,45]. Low-cost development approaches, such as OPEN [2], are also enabling adoption by reducing the cost of spacecraft development. In 2013, 30 academic and 50 non-academic CubeSats were manifested, and over 100 institutions have participated in the development of a CubeSat-class spacecraft [30].

2.3. Related Remote Sensing Activities

Spacecraft are one mechanism for conducting remote sensing. Several other remote sensing educational activities also bear mention. Rundquist and Vandeberg [46] showed how in-the-field exercises on the ground can be utilized to drive student understanding, learning and excitement, without the need for a launch. Saad [47] and Jackson et al. [48], conversely, showed how a low-cost (compared to a satellite launch) high-altitude balloon launch can generate excitement for K-12 students and college undergraduates. Nordlie et al. [49] demonstrated how an even lower-cost solar balloon (which can only reach 60,000 feet, as compared to the approximately 100,000-foot ceiling of helium high-altitude balloons) can be used to generate similar excitement for under ten percent of the cost. Mountrakis et al. [50] quantified the impact of a project-based remote sensing activity on student performance on post-project exercises, showing a possible impact on the performance of lower-achieving students. They also noted qualitative benefits.

2.4. The OpenOrbiter Program

The OpenOrbiter Small Spacecraft Development Initiative commenced in 2012, following a thematically related program. Its goals, name and logo are all student-developed. It has involved both STEM and non-STEM students from across the campus of the University of North Dakota [8]. OpenOrbiter seeks to demonstrate the efficacy of the designs [1] of the Open Prototype for Educational NanoSats via the development, launch and on-orbit operation of an OPEN-based spacecraft. OPEN facilitates the low-cost development of CubeSats by making complete designs, software, fabrication and testing instructions and other materials publicly available. With OPEN, a CubeSat can be developed for a parts cost (excluding payload-specific components) of approximately $5,000 [2]. This is significantly less than the $40,000 or more that might be spent buying a one-time-use kit-based spacecraft or the $250,000 cost of developing the designs from scratch [40]. These lower cost levels will allow greater penetration of spacecraft development and spacecraft-based experiments into the educational systems of more affluent countries and enable spacecraft development in less-affluent ones [51].

2.5. Assessment of the Educational Value of the OpenOrbiter Program

Prior work on the assessment of the educational value of the OpenOrbiter program and the benefits it provides has been conducted [23]. The level of increase in student self-identified status in five metric areas was reported. These increases were also attributed to program participation by the student participants. It was shown that comparable levels of overall benefit were enjoyed by both graduate and undergraduate students, and this improvement was attributed to program participation. Team leads were shown to receive more (approximately double the level of) benefit from participation, and those participating more were also shown to receive greater benefit. Some correlation between time participating (based on surveys taken at one time, which recorded data about individuals who had participated for different lengths of time) and benefit level was also demonstrated. The percentage of participants in each category showing improvement was shown to have minimal correlation with the amount of time that individuals had been in the program. Excluding some outlying data, the level of attribution of benefit to the program was consistent across participation durations. Only very limited correlation between grade level, GPA and benefits was identified. Other work [52] identified the benefits that students sought from their participation in the program and quantified the level of interest among participants in receiving these benefits. This article expands this analysis to focus on the impact that the duration of participation has on the level of benefit received.

3. Experimental Goals and Design

Five key areas of focus were initially identified when attempting to ascertain the educational effectiveness of the OOSDI. These were technical skills in the participant’s area of focus, spacecraft design skills, excitement about space, presentation skills and comfort in giving presentations. While additional areas of focus have subsequently been identified, only one survey has been conducted characterizing participant attainment of benefits in these areas, so they are not considered at this time.
Initially, a 26-question survey [23] was conducted. The first 12 questions collected demographic data about the participant and their involvement in OOSDI. This included information about their academic status (undergraduate vs. graduate, class standing, time in program, GPA) and their participation in OOSDI (number of years participating, hours participating per week and whether or not they participated for academic credit). They were also asked whether they had previous involvement in spacecraft design and, if so, the type of involvement.
The second version of this survey asked 42 questions, including the 26 from the initial survey. The duration of participation was changed from years to semesters to avoid confusing students when answering this question. The data from the earlier survey was multiplied by two (as UND has two regular-year semesters and the program has had extremely minimal participation by students over the summer months) to allow the year and semester numbers to be utilized together. This treatment is consistent with the clarification provided regarding how to answer the previous year’s question. An additional option was added to the question regarding academic participation, and the questions regarding whether the participant had received credit for participating and the type of for-credit participation were combined into a single question (with answers in the format “yes—type” or “no”).
The majority of the survey consists of questions presented on a 9-point scale. In all cases, 9 is the superior answer (indicating greater experience, knowledge, etc.), 5 is neutral and 1 is the inferior answer. Students were asked, on both surveys, to characterize their pre-participation status and current status with regard to each of the five metrics.
The data from these two survey forms has been utilized to assess the correlation between the duration of participation (in semesters) and the level of benefit attained. This level of benefit has been calculated in all cases by subtracting the pre-participation status level from the post-participation status level. In a very limited number of cases, this resulted in inexplicable negative values. These have been investigated (as discussed in [23]) and replaced with zeroes, as they appear (based on responses to the attribution questions) to represent clerical errors by participants; they are, otherwise, uninterpretable.
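As an illustration of this benefit calculation, the following minimal sketch (in Python with pandas; the column names such as pre_technical and post_technical are hypothetical, and the published analysis was actually performed in Excel) computes the per-metric benefit and zeroes out the negative values treated as clerical errors:

```python
import pandas as pd

# Hypothetical column naming; the actual survey instrument and Excel workbook differ.
METRICS = ["technical", "spacecraft_design", "excitement",
           "presentation_skills", "presentation_comfort"]

def compute_benefit(responses: pd.DataFrame) -> pd.DataFrame:
    """Benefit = post-participation status minus pre-participation status (9-point scale),
    with inexplicable negative values (treated as clerical errors) replaced by zero."""
    benefit = pd.DataFrame(index=responses.index)
    for metric in METRICS:
        delta = responses[f"post_{metric}"] - responses[f"pre_{metric}"]
        benefit[metric] = delta.clip(lower=0)  # zero out negative deltas
    return benefit
```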

4. Results and Discussion

This section presents the process and results of the analysis of the data collected as described in the foregoing section. In addition to the five key metrics for which data was collected, an additional aggregate metric was created by adding together the benefit attained in each of the other areas. As a participant would not necessarily receive benefit in all areas, this combined metric may serve as a more holistic view of the value to the participant. The analysis process commenced with the correlation between all respondents’ duration of participation (in semesters) and their improvement in each of the five key metrics and the combined aggregate metric. This data, which is based on 31 respondent surveys (nine masters’ students, twenty-one undergraduate students and one individual who did not respond regarding his or her student status), is presented in Table 1. This data included responses from seventeen non-lead participants, thirteen lead-level participants and one individual who did not respond to this question. Note that all analysis was performed in Microsoft Excel.
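As a sketch of this computation (the published analysis used Excel, whose CORREL function computes Pearson’s r; the Python equivalent below is illustrative only, with hypothetical column names), the correlation of each metric, plus the aggregate, with the duration of participation might be calculated as follows:

```python
import pandas as pd

def correlation_with_duration(responses: pd.DataFrame, benefit: pd.DataFrame) -> pd.Series:
    """Pearson correlation between semesters of participation and each benefit metric,
    plus the aggregate metric formed by summing the five individual benefits."""
    benefit = benefit.copy()
    benefit["aggregate_improvement"] = benefit.sum(axis=1)  # holistic combined metric
    semesters = responses["semesters"]  # duration of participation, in semesters
    return benefit.apply(lambda column: column.corr(semesters))  # Pearson's r per metric
```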
Table 1. Correlation between number of semesters involved in the program and the five assessed metrics.

Technical Skill | Spacecraft Design | Level of Excitement | Presentation Skills | Presentation Comfort | Aggregate Improvement
0.26 | 0.17 | −0.13 | 0.24 | 0.21 | 0.22
From the data presented, it is clear that there is limited correlation between the metrics and the amount of time spent participating, when neglecting all other confounding factors. For four of the metrics, correlation levels of between 0.17 and 0.26 are reported; one (level of excitement) reports a negative correlation level. The aggregate improvement correlation level is also below 0.25. As correlation levels are on a −1 to 1 scale (with −1 indicating perfect inverse correlation and 1 indicating perfect correlation), these values do not provide much support for the thesis that prolonged participation provides greater levels of benefit.
Because of the nature of student participation, however, different students will receive benefits at different rates. Possible confounding variables are now considered. In Table 2, the correlation analysis is separated by graduate or undergraduate status.
Table 2. Comparison of the correlation between number of semesters involved in the program and the five assessed metrics, for masters and undergraduate students.

Group | Technical Skill | Spacecraft Design | Level of Excitement | Presentation Skills | Presentation Comfort | Aggregate Improvement
Bachelors Students | −0.12 | −0.21 | −0.24 | 0.10 | 0.04 | −0.17
Masters Students | 0.85 | 0.84 | 0.35 | 0.63 | 0.37 | 0.89
For the masters’ students, this data shows strong correlation in three areas (technical skills, spacecraft design and aggregate improvement); moderate correlation is also shown for presentation skills. The level of excitement and presentation comfort metrics show greater correlation levels than with the non-separated data. For the bachelors’ students, however, only limited positive and negative correlations are present. For the purposes of assessing statistical significance, a per-semester improvement value was created by dividing each benefit value by the duration of participation (in semesters). From this, the difference between the two groups in terms of space excitement (p = 0.04) was significant at the p = 0.05 level, and technical skills (p = 0.08) and aggregate improvement (p = 0.07) were significant at the p = 0.10 level. Spacecraft design (p = 0.18), presentation skills (p = 0.12) and presentation comfort (p = 0.48) were not shown to be significantly different at either p = 0.05 or p = 0.10.
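The paper does not name the statistical test used; as an illustrative sketch only, a two-sample Welch t-test on the per-semester improvement values (benefit divided by semesters of participation) for the bachelors- and masters-level groups could be computed as follows:

```python
import pandas as pd
from scipy import stats

def per_semester_improvement(benefit: pd.Series, semesters: pd.Series) -> pd.Series:
    """Improvement as a function of duration: benefit divided by semesters participated."""
    return benefit / semesters

def compare_levels(bachelors_rate: pd.Series, masters_rate: pd.Series) -> float:
    """Hypothetical group comparison (Welch's t-test); the test actually used is not named."""
    _, p_value = stats.ttest_ind(bachelors_rate, masters_rate, equal_var=False)
    return p_value
```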
Next, the correlation assessment is performed separating participants and team leads. This data is presented in Table 3.
Table 3. Comparison of the correlation between the number of semesters involved in the program and the five assessed metrics between those serving in a participant and team lead role.

Group | Technical Skill | Spacecraft Design | Level of Excitement | Presentation Skills | Presentation Comfort | Aggregate Improvement
Participant—Bachelors | −0.12 | −0.24 | −0.26 | 0.27 | 0.19 | −0.17
Participant—Masters | 1 | 1 | N/A | N/A | 1 | 1
Participant—Combined | −0.09 | −0.19 | −0.23 | 0.28 | 0.21 | −0.11
Team Lead—Bachelors | −0.11 | 0.00 | −0.49 | −0.66 | −0.32 | −0.34
Team Lead—Masters | 0.83 | 0.86 | 0.27 | 0.57 | 0.32 | 0.88
Team Lead—Combined | 0.72 | 0.63 | −0.03 | 0.13 | 0.21 | 0.52
Team leads, overall, show moderate correlation in technical skills, spacecraft design and aggregate improvement and limited correlation in presentation skills and comfort. More pronounced results are demonstrated by the masters-level students. Very limited negative correlation is shown for the level of excitement. Undergraduate participants, however, show limited positive or negative correlation for all metrics, while the very limited data set for masters-level participants shows perfect correlation.
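The segmented correlations reported in Table 2, Table 3 and the tables that follow all repeat the same computation over different groupings; a single grouped sketch (again with hypothetical column names, offered as an illustration rather than the actual Excel workflow) could look like this:

```python
import pandas as pd

def segmented_correlations(data: pd.DataFrame, group_col: str, metrics: list) -> pd.DataFrame:
    """Correlation between semesters of participation and each benefit metric, computed
    separately for each value of a grouping variable (e.g., academic level, team lead
    status, hours per week, for-credit status or major)."""
    def corr_with_semesters(group: pd.DataFrame) -> pd.Series:
        return group[metrics].apply(lambda col: col.corr(group["semesters"]))
    return data.groupby(group_col).apply(corr_with_semesters)
```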
As the amount of weekly involvement in the project may affect the level of benefit obtained, the five metrics are now correlated separately based on the number of hours worked each week (one 8+ h response has been removed, as it represented insufficient data for analysis). This data is presented in Table 4.
Table 4. Comparison of the correlation between the number of semesters involved in the program and the five assessed metrics between those devoting between 1–3.99 h per week and those devoting 4–7.99 h per week.

Group | Technical Skill | Spacecraft Design | Level of Excitement | Presentation Skills | Presentation Comfort | Aggregate Improvement
1–3.99 h—Bachelors | 0.02 | −0.19 | −0.14 | 0.31 | 0.19 | −0.04
1–3.99 h—Masters | 0.87 | 0.82 | 0.26 | 0.50 | 0.86 | 0.87
1–3.99 h—Combined | 0.30 | 0.08 | −0.27 | 0.36 | 0.39 | 0.27
4–7.99 h—Bachelors | −0.53 | −0.33 | −0.54 | −0.56 | −0.25 | −0.60
4–7.99 h—Masters | 1.00 | 1.00 | 1.00 | 1.00 | N/A | 1.00
4–7.99 h—Combined | 0.33 | 0.32 | −0.05 | 0.04 | −0.24 | 0.18
While this segmentation produces five combined correlation values in the 0.3 to 0.4 range and three in the ±0.2–0.3 range, no clear trend emerges for the combined data. The masters’ students, however, show strong correlation for the majority of the metrics. The limited set of undergraduate high-commitment data appears to show a pronounced negative correlation; however, this is likely attributable to students entering and leaving this group between survey administrations (rather than the unrealistic conclusion that students were unlearning skills). It may also be indicative of students re-evaluating their own skill levels in light of a better understanding of the subject material. Next, the data was correlated segmented by whether or not the participants received academic credit for their participation. This data is presented in Table 5.
Table 5. Comparison of the correlation between the number of semesters involved in the program and the five assessed metrics between those participating for course credit and those not participating for course credit.

Group | Technical Skill | Spacecraft Design | Level of Excitement | Presentation Skills | Presentation Comfort | Aggregate Improvement
Credit—Bachelors | −0.23 | −0.32 | −0.08 | 0.08 | 0.25 | −0.24
Credit—Combined | 0.38 | 0.30 | −0.04 | 0.19 | −0.25 | 0.23
No Credit—Bachelors | −0.25 | −0.62 | −0.40 | −0.13 | −0.32 | −0.48
No Credit—Masters | 0.79 | 0.82 | 0.63 | 0.53 | −0.35 | 0.82
No Credit—Combined | 0.03 | −0.14 | −0.12 | 0.12 | 0.43 | 0.10
Again, two combined correlation values in the 0.3 to 0.4 range are produced and one value in the 0.4–0.5 range is generated. Two values in the ±0.2 to 0.3 range are also indicated. However, no moderate or strong correlation values are indicated for the combined data. Insufficient masters-level for-credit participants were present to allow reporting; however, the impact of this limited set on the combined metric is pronounced. There is also a clear difference between the masters- and bachelors-level students in the no-credit category.
The majors of the participants may also have some impact on the correlation between the duration of participation and the level of benefit received. As the project is run out of the UND Computer Science Department, there has been consistent Computer Science student involvement throughout the project. Thus, correlation is now performed for only Computer Science students. This data is presented in Table 6.
Table 6. Correlation between number of semesters involved in the program and the five assessed metrics, for computer science students.

Group | Technical Skill | Spacecraft Design | Level of Excitement | Presentation Skills | Presentation Comfort | Aggregate Improvement
Computer Science | 0.31 | 0.14 | −0.21 | 0.18 | 0.17 | 0.20
Again, no strong trends are present. One correlation value in the 0.3 to 0.4 range and two in the ±0.2 to 0.3 range are produced. Finally, the correlation segmented by both the field of major (divided into computer science and others) and academic level (graduate or undergraduate) is calculated. This data is presented in Table 7.
Table 7. Correlation between the number of semesters participating and increase in the five assessed metrics, for computer science and non-computer science students at both the bachelors’ and masters’ levels.

Group | Technical Skill | Spacecraft Design | Level of Excitement | Presentation Skills | Presentation Comfort | Aggregate Improvement
Bachelors—Computer Science | −0.07 | −0.22 | −0.27 | 0.08 | 0.01 | −0.17
Masters—Computer Science | 0.95 | 0.88 | 0.07 | 0.42 | 0.34 | 0.95
Bachelors—Other | −0.95 | −0.90 | N/A | N/A | N/A | −1.00
Masters—Other | 0.90 | 0.89 | 0.58 | N/A | 0.58 | 0.86
This data indicates very strong correlation for both computer science and non-computer science masters-level students in the technical skill, spacecraft design and aggregate improvement categories. Moderate correlation is shown for the non-computer science masters students in level of excitement and presentation comfort. The masters-level computer science students also produce one correlation value in the 0.3 to 0.4 range and one in the 0.4 to 0.5 range. Note that, in several cases, data characteristics generated a divide-by-zero issue for the Excel correlation function; these cases are indicated with an “N/A” in Table 7.
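These N/A cells occur when a segment contains too few respondents or no variation in one of the series, making the Pearson denominator zero; a guarded version of the correlation (an illustrative sketch, not the Excel formula actually used) is shown below:

```python
import pandas as pd

def safe_corr(x: pd.Series, y: pd.Series):
    """Return Pearson's r, or None when it is undefined (fewer than two data points,
    or zero variance in either series); this is the condition behind the "N/A" cells."""
    if len(x) < 2 or x.std() == 0 or y.std() == 0:
        return None  # reported as "N/A" in the tables
    return x.corr(y)
```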
What is problematic in this data is the very strong negative correlation values reported for non-computer science undergraduates (−0.95, −0.90 and −1.00). As it seemed unlikely that this could be attributable to the nature of the program (i.e., that students gain less value the longer they participate), this was investigated. It is attributable to the limited number of respondents and to the presence of some respondents who achieved significant gains in a single semester, as compared to other (distinctly different) respondents who achieved moderate levels of improvement over a longer period of time. This result is, thus, a quirk of the data and of the limited number of respondents in this category.

5. Conclusions and Future Work

This paper has considered the correlation between the duration of participation in the program and the level of value attained. For several groups of participants, this correlation was shown to exist. However, it was not shown to exist in the general case (attributable, as demonstrated, to the presence of confounding variables). There were also some groups where no correlation between benefit level and participation duration could be demonstrated. For graduate students, a very strong correlation was shown to exist in the technical and spacecraft design skills categories. Strong correlation was shown when considering the master’s students as a group, and this was even stronger when they were divided into computer science and non-computer science students. Presentation skills showed a moderate level of correlation for the combined (all master’s students) group. This correlation could not be assessed separately for the non-computer science master’s students and was not as pronounced when only the computer science master’s students were considered. Level of excitement and presentation comfort showed moderate correlation for the non-computer science master’s students and a lower level of correlation for the computer science master’s students. Both the computer science and non-computer science master’s students showed a strong (0.95 in the case of computer science master’s students) aggregate improvement correlation; this was also evidenced in the combined (all master’s students) data.
Team leads (a group which included both graduate and undergraduate students) showed moderate correlation between the duration of participation and the level of value attained in three areas (technical skills, spacecraft design skills and aggregate improvement). However, beyond the team leads, no grouping could be found from the data elements collected that removed a confounding variable so as to demonstrate strong positive correlation between the duration of participation and the level of benefit attained for undergraduates. This could indicate that undergraduates do not gain significant additional benefit beyond some level of participation. It could also be a function of limited sample sizes. Perhaps there is some other confounding variable that was not identified or surveyed that, if the data were segmented by it, would allow this correlation to be demonstrated. Continued analysis of this topic will be a subject for future work.
The presence or absence of correlation is important for several reasons. First, the presence of correlation suggests the value of longer-duration participation; this correlation was an expected outcome. Second, when the data shows a significant difference when segmented by a variable, this identifies that variable as a factor in the level of benefit that students attain. Some of these factors (such as team lead status, for-credit participation status or the number of hours worked per week) can be controlled if greater levels of benefit are seen for one status than for others. Others (such as major or graduate versus undergraduate status) demonstrate what groups should be targeted for participation, as students in these groups receive particular benefit. The identification of groups (of all types) that do not perform as well as others also serves to focus attention on improving the outcomes of the program for these groups (or on identifying non-modifiable limiting factors). Continued analysis of what levels of benefit are generated, and for what groups of students, will serve as a subject for ongoing work.

When this study was started, individual identifying information was not collected on the surveys, limiting their utility for long-term tracking at the individual level. This, combined with the high level of churn (particularly among undergraduate participants, both overall and entering and exiting particular classifications), has resulted in data that seems to show a negative correlation with skill learning (which makes little sense). This same trend is not present for the masters-level students, where participation has been more consistent (though attrition has been significant, movement in and out of classifications has not been). Future work will therefore include the collection of identifying information to facilitate individual tracking.

Acknowledgments

Small satellite development work at the University of North Dakota is or has been supported by the North Dakota Space Grant Consortium, North Dakota NASA EPSCoR, the University of North Dakota Faculty Research Seed Money Committee, North Dakota EPSCoR (NSF Grant # EPS-814442), the Department of Computer Science, the John D. Odegard School of Aerospace Sciences and the National Aeronautics and Space Administration. The involvement of the numerous students and faculty from multiple disciplines in this project and the support of corporate sponsors (including GitHub, Inc. and Trollweb A/S of Norway) is gratefully acknowledged.

Author Contributions

Jeremy Straub was primarily responsible for experimental design and most data analysis. David Whalen contributed to experimental design and was responsible for human subject protection compliance.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Straub, J.; Korvald, C.; Nervold, A.; Mohammad, A.; Root, N.; Long, N.; Torgerson, D. OpenOrbiter: A low-cost, educational prototype CubeSat mission architecture. Machines 2013, 1, 1–32. [Google Scholar] [CrossRef]
  2. Berk, J.; Straub, J.; Whalen, D. The Open Prototype for Educational NanoSats: Fixing the Other Side of the Small Satellite Cost Equation. In Proceedings of the 2013 IEEE Aerospace Conference, Big Sky, MT, USA, 2–9 March 2013.
  3. Snell, K.D. The apprenticeship system in British history: The fragmentation of a cultural institution. Hist. Educ. Quart. 1996, 25, 303–321. [Google Scholar] [CrossRef]
  4. Elbaum, B. Why apprenticeship persisted in Britain but not in the United States. J. Econ. Hist. 1989, 49, 337–349. [Google Scholar] [CrossRef]
  5. Gilmore, M. Improvement of STEM education: Experiential learning is the key. Mod. Chem. Appl. 2013, 1. [Google Scholar] [CrossRef]
  6. Larsen, J.A.; Nielsen, J.D. Development of CubeSats in an educational context. In Proceedings of the 2011 5th International Conference on Recent Advances in Space Technologies (RAST), Istanbul, Turkey, 9–11 June 2011; pp. 777–782.
  7. Larsen, J.A.; Nielsen, J.F.D.; Zhou, C. Motivating students to develop satellites in problem and project-based learning (PBL) environment. Int. J. Eng. Pedagog. 2013, 3, 11–17. [Google Scholar]
  8. Straub, J.; Berk, J.; Nervold, A.; Whalen, D. OpenOrbiter: An interdisciplinary, student run space program. Advan. Educ. 2013, 2, 4–10. [Google Scholar]
  9. Saunders-Smits, G.N.; Roling, P.; Brügemann, V.; Timmer, N.; Melkert, J. Using the Engineering Design Cycle to Develop Integrated Project Based Learning in Aerospace Engineering. In Proceedings of the International Conference on Innovation, Practice and Research in Engineering Education, Coventry, UK, 18–20 September 2012; pp. 18–20.
  10. Qidwai, U. Fun to learn: Project-based learning in robotics for computer engineers. ACM Inroads 2011, 2, 42–45. [Google Scholar] [CrossRef]
  11. Bütün, E. Teaching genetic algorithms in electrical engineering education: A problem-based learning approach. Int. J. Electr. Eng. Educ. 2005, 42, 223–233. [Google Scholar] [CrossRef]
  12. Robson, N.; Dalmis, I.S.; Trenev, V. Discovery Learning in Mechanical Engineering Design: Case-based Learning or Learning by Exploring? In Proceedings of the 2012 ASEE Annual Conference, San Antonio, TX, USA, 10–13 June 2012.
  13. Correll, N.; Wing, R.; Coleman, D. A one-year introductory robotics curriculum for computer science upperclassmen. IEEE Trans. Educ. 2013, 56, 54–60. [Google Scholar] [CrossRef]
  14. Pollard, C.E. Lessons learned from client projects in an undergraduate project management course. J. Inf. Syst. Edu. 2012, 23, 271–282. [Google Scholar]
  15. Okudan, G.E.; Rzasa, S.E. A project-based approach to entrepreneurial leadership education. Technovation 2006, 26, 195–210. [Google Scholar] [CrossRef]
  16. Mathers, N.; Goktogen, A.; Rankin, J.; Anderson, M. Robotic mission to mars: Hands-on, minds-on, web-based learning. Acta Astronaut. 2012, 80, 124–131. [Google Scholar] [CrossRef]
  17. Doppelt, Y. Implementation and assessment of project-based learning in a flexible environment. Int. J. Technol. Des. Educ. 2003, 13, 255–272. [Google Scholar] [CrossRef]
  18. Ayob, A.; Majid, R.A.; Hussain, A.; Mustaffa, M.M. Creativity enhancement through experiential learning. Adv. Nat. Appl. Sci. 2012, 6, 94–99. [Google Scholar]
  19. Simons, L.; Fehr, L.; Blank, N.; Connell, H.; Georganas, D.; Fernandez, D.; Peterson, V. Lessons learned from experiential learning: What do students learn from a practicum/internship? Int. J. Teach. Learn. High. Educ. 2012, 24, 325–334. [Google Scholar]
  20. Breiter, D.; Cargill, C.; Fried-Kline, S. An industry view of experiential learning. Hosp. Rev. 2013, 13, 75–80. [Google Scholar]
  21. Edwards, A.; Jones, S.M.; Wapstra, E.; Richardson, A.M. Engaging Students through Authentic Research Experiences. In Proceedings of The Australian Conference on Science and Mathematics Education (formerly UniServe Science Conference), Sydney, Australia, 26–28 September 2012.
  22. Bauerle, T.L.; Park, T.D. Experiential learning enhances student knowledge retention in the plant sciences. HortTechnology 2012, 22, 715–718. [Google Scholar]
  23. Straub, J.; Whalen, D. An assessment of educational benefits from the OpenOrbiter space program. Educ. Sci. 2013, 3, 259–278. [Google Scholar] [CrossRef]
  24. Thakker, P.; Swenson, G. Management and Implementation of a CubeSat Interdisciplinary Senior Design Course. In Emergence of Pico- and Nanosatellites for Atmospheric Research and Technology Testing; Thakker, P., Shiroma, W., Eds.; American Institute of Aeronautics and Astronautics: Reston, VA, USA, 2010; pp. 17–32. [Google Scholar]
  25. Wertz, J.R.; Everett, D.F.; Puschell, J.J. Space Mission Engineering: The New SMAD; Microcosm Press: Hawthorne, CA, USA, 2011. [Google Scholar]
  26. Swartwout, M. AC 2011–1151: Significance of Student-built Spacecraft Design Programs: It’s Impact on Spacecraft Engineering Education over the Last Ten Years. In Proceedings of the American Society for Engineering Education Annual Conference, Vancouver, BC, Canada, 26–29 June 2011.
  27. Swartwout, M. University-class Satellites: From Marginal Utility to “Disruptive” Research Platforms. In Proceedings of the 18th Annual AIAA/USU Conference on Small Satellites, Logan, UT, USA, 9–13 August 2004.
  28. Deepak, R.A.; Twiggs, R.J. Thinking out of the Box: Space science beyond the CubeSat. J. Small Satell. 2012, 1, 3–7. [Google Scholar]
  29. Swartwout, M. Cheaper by the Dozen: The Avalanche of Rideshares in the 21st Century. In Proceedings of the 2013 IEEE Aerospace Conference, Big Sky, MT, USA, 2–9 March 2013; pp. 1–12.
  30. Swartwout, M. The Long-threatened Flood of University-class Spacecraft (and CubeSats) Has Come: Analyzing the Numbers. In Proceedings of the 27th Annual AIAA/USU Conference on Small Satellites, Logan, UT, USA, 10–15 August 2013.
  31. Bergsrud, C.; Straub, J. A 6-U Commercial Constellation for Space Solar Power Supply to Other Spacecraft. In Proceedings of the 2013 Spring CubeSat Developers’ Workshop, San Luis Obispo, CA, USA, 24–26 April 2013.
  32. Padmanabhan, S.; Brown, S.; Kangaslahti, P.; Cofield, R.; Russell, D.; Stachnik, R.; Steinkraus, J.; Lim, B. A 6U CubeSat Constellation for Atmospheric Temperature and Humidity Sounding. In Proceedings of the AIAA/USU Conference on Small Satellites, Logan, UT, USA, 10–15 August 2013.
  33. Bailey, J.; Tsitas, S.; Bayliss, D.; Bedding, T. A CubeSat Mission for Exoplanet Transit Detection and Astroseismology. In Proceedings of The 6U CubeSat Low Cost Space Missions Workshop, Canberra, Australia, 17–18 July 2012.
  34. Chirayath, V.; Mahlstedt, B. HiMARC 3D-high-speed, Multispectral, Adaptive Resolution Stereographic CubeSat Imaging Constellation. In Proceedings of the AIAA/USU 2012 Small Satellite Conference, Logan, UT, USA, 13–16 August 2012.
  35. Noca, M.; Jordan, F.; Steiner, N.; Choueiri, T.; George, F.; Roethlisberger, G.; Scheidegger, N.; Peter-Contesse, H.; Borgeaud, M.; Krpoun, R. Lessons Learned from the First Swiss Pico-Satellite: SwissCube. In Proceedings of the AIAA/USU Conference on Small Satellites, Logan, UT, USA, 10–13 August 2009.
  36. Weeks, D.; Marley, A.B.; London, J., III. SMDC-ONE: An Army Nanosatellite Technology Demonstration. In Proceedings of the AIAA/USU Conference on Small Satellites, Logan, UT, USA, 10–13 August 2009.
  37. Abramowitz, L.R. US Air Force’s SMC/XR SENSE NanoSat Program. In Proceedings of the AIAA Space 2011 Conference & Exposition, Long Beach, CA, USA, 27–29 September 2011.
  38. Taraba, M.; Rayburn, C.; Tsuda, A.; MacGillivray, C. Boeing’s CubeSat TestBed 1 Attitude Determination Design and On-orbit Experience. In Proceedings of the AIAA/USU Conference on Small Satellites, Logan, UT, USA, 10–13 August 2009.
  39. Fitzsimmons, S.; Tsuda, A. Rapid Development Using Tyvak’s Open Source Software model. In Proceedings of the AIAA/USU Conference on Small Satellites, Logan, UT, USA, 10–15 August 2013.
  40. Straub, J. Cubesats: A Low-cost, Very High-return Space Technology. In Proceedings of the 2012 Reinventing Space Conference, Los Angeles, CA, USA, 7–11 May 2012.
  41. Hunyadi, G.; Ganley, J.; Peffer, A.; Kumashiro, M. The University Nanosat Program: An Adaptable, Responsive and Realistic Capability Demonstration Vehicle. In Proceedings of the 2004 IEEE Aerospace Conference, Big Sky, MT, USA, 6–13 March 2004; Volume 5.
  42. Skrobot, G.; Coelho, R. ELaNa–educational Launch of Nanosatellite: Providing Routine RideShare Opportunities. In Proceedings of the SmallSat Conference, Logan, UT, USA, 13–16 August 2012.
  43. European Space Agency Call for Proposals: Fly Your Satellite! Available online: http://www.esa.int/Education/Call_for_Proposals_Fly_Your_Satellite (accessed on 13 August 2013).
  44. Garvey, J.; Besnard, E. Development Status of a Nanosat Launch Vehicle. In Proceedings of the 40th AIAA/ASME/SAE/ASEE Joint Propulsion Conference & Exhibit, Fort Lauderdale, FL, USA, 11–14 July 2004.
  45. Milliron, R. Interorbital’s NEPTUNE Dedicated SmallSat Launcher: 2013 Test Milestones and Launch Manifest Update. In Proceedings of the 2013 Spring CubeSat Developers’ Workshop, San Luis Obispo, CA, USA, 24–26 April 2013.
  46. Rundquist, B.C.; Vandeberg, G.S. Fully engaging students in the remote sensing process through field experience. J. Geogr. 2013, 112, 262–270. [Google Scholar] [CrossRef]
  47. Saad, M. Providing Hands-on STEM Education with High Altitude Balloons in North Dakota. In Proceedings of the Academic High Altitude Conference, Upland, IN, USA, 27–28 June 2013.
  48. Jackson, K.; Fevig, R.; Seelan, S. North Dakota State-wide High Altitude Balloon Student Payload Competition. In Proceedings of the 3rd Annual Academic High Altitude Conference, Nashville, TN, USA, 27–29 June 2012.
  49. Nordlie, J.; Straub, J.; Theisen, C.; Marsh, R. Solar Ballooning: A Low-cost Alternative to Helium Balloons for Small Spacecraft Testing. In Proceedings of the AIAA Science and Technology Forum and Exposition (SciTech 2014), National Harbor, MD, USA, 13–17 January 2014.
  50. Mountrakis, G.; Triantakonstantis, D. Inquiry-based learning in remote sensing: A space balloon educational experiment. J. Geogr. High. Educ. 2012, 36, 385–401. [Google Scholar] [CrossRef]
  51. Straub, J.; Berk, J.; Nervold, A.; Korvald, C.; Torgerson, D. Application of Collaborative Autonomous Control and the Open Prototype for Educational NanoSats Framework to Enable Orbital Capabilities for Developing Nations. In Proceedings of the 64th International Astronautical Congress, Beijing, China, 23–27 September 2013.
  52. Straub, J.; Whalen, D. Student expectations from participating in a small spacecraft development program. Aerospace 2013, 1, 18–30. [Google Scholar] [CrossRef]

