This is a summary of replies to a posting asking for input for a course
on "research methods."  It is organized around the topics in the original
posting.

 1 Introduction
  1.1 General Comments
 2 Textbooks on Social Science Research Methods
  2.1 List of Suggested Texts
  2.2 Comments on Suggested Texts
 3 Supplemental Reading
  3.1 List of Suggested Readings
  3.2 Comments on Suggested Readings
    3.2.1 Data Analysis
    3.2.2 Research Methods
  3.3 Anti-positivist Supplemental Readings
    3.3.1 Suggested Anti-positivist Supplemental Readings
    3.3.2 Comments on Suggested Anti-Positivist Readings
 4 Guides or Aids for Writing Theses and Thesis Proposals
  4.1 Suggestions
 5 Jump Starting a Thesis
  5.1 Suggestions
 6 Data Analysis: (SAS) Texts and Data
  6.1 Appropriate Statistical Software
    6.1.1 SAS: Pro and Con
    6.1.2 Alternatives to SAS
  6.2 Suggested Texts
    6.2.1 Comments on Suggested Texts
  6.3 Data
  6.4 Should We Teach Data Analysis This Way?
    6.4.1 Is Statistics Dangerous?
    6.4.2 Topics to Cover
 7 Topics and Issues in Social Science Research Methods Course
 8 My Plan for the Course
  8.1 The Data Analysis Lab
  8.2 Class Sessions
  8.3 Individual Work
 9 Conclusion

Since the summary is quite long, each section starts at the left
margin and is underlined with dashes to facilitate searches for topic
headings.  I have added my own comments and given credit where appropriate.
Text from my original posting is preceded by ">" and quotations from other
people are preceded by one or more vertical bars ("|").  I deliberately
reproduce the questions from my original posting at the start of each
section so that people interested in a specific topic can go directly
there and see what the issue is.

I want to thank everyone who commented; I found your comments both useful
and fascinating.


1 Introduction
----------------

Here is my original posting:

>Hi,
>
>Pardon my posting of this to several lists.  I am casting my net widely
>in the hope of catching several fish.
>
>In the spring I will be teaching a research methods course to first
>year graduate students in a two-year community (urban) planning program
>leading to a professional masters degree (MCP).  The course has three main
>objectives: 1) having the students write an initial draft of their thesis
>proposals, 2) introducing them to social science research methods, and 3)
>learning sufficient computer data analysis skills to do data analysis for their
>theses.  There are several things I would like your input on.
>
>Textbooks on Social Science Research Methods
>--------------------------------------------
>
>I plan to use _Doing Urban Research_ by Gregory Andranovich and Gerry
>Riposa (Sage, 1993).  This book has several virtues.  It is short (98 p.
>of text), has class exercises, considers diverse theoretical orientations
>and considers non-positivist urban research (e.g., Mike Davis, Mark
>Gottdiener, etc.) to be today's mainstream, covers key issues (spatial
>levels of analysis, the unity of theory/basic/applied research, research
>design, data collection, qualitative research, and reporting research).
>Obviously, the coverage in such a short book is cursory.
>
>Still, I have some misgivings about the book.  It presents "the scientific
>method" as the standard Popperian account of deduction, hypothesis, etc.
>It emphasizes causality, and its notion of causality goes little beyond
>that of Hume and JS Mill.  Its abbreviated form and common-sense writing
>style may gloss over very complex issues while giving too brief treatment
>for students to master the material adequately.  Its treatment of epistemology
>and the philosophy of science is almost nil: I would like much more
>explicit discussion of the various epistemological "isms" (positivism, realism,
>subjectivism, rationalism, postmodernism, feminism, etc.).  The political
>implications of epistemological positions are never discussed.  Etc.
>Can anyone suggest another social science research methods text to use either
>instead of it or alongside it?
>
>Supplemental Reading
>--------------------
>Partly to deal with these shortcomings, I want to use a "jigsaw" approach
>in class sections.  This teaching technique uses 3-4 supplemental readings
>per course period.  Students must read at least one of the readings, and
>in class they break up into groups in which the students collectively have
>read every reading.  They then share what they read with the other students
>in the group, and the group works together to answer a question or do a task.
>Then the groups report to the class, and class discussion focuses on the
>topic of the day.
>
>Different topics I'd like to treat this way include: the relation between
>theory, knowledge, and practice; causality; conceptions of science and
>human knowledge; the uses of research; the relation between theory and method;
>levels of analysis; defining research goals and objectives; literature
>reviews; research design; data collection; data sources; reporting research;
>design of research instruments; overviews of data analysis techniques; etc.
>Can anyone suggest readings that might be useful in this setting and for
>this purpose?
>
>Guides or Aids for Writing Theses and Thesis Proposals
>------------------------------------------------------
>Self-explanatory.
>
>
>Data Analysis: (SAS) Texts and Data
>-----------------------------------
>
>The class has a 1 1/2 hour per week computer lab which covers data analysis.
>In general we use SAS because it is available around the university,
>powerful, and widely used.  The text I've used in past years is _SAS
>System for Elementary Statistical Analysis_ by Sandra Schlotzhauer and
>Ramon Littell.  However, both SAS and this text have severe problems.
>
>Although the students have all had one undergraduate statistics course and
>an introductory methods course that introduces them to computers via a
>word processor (WordPerfect) and a spreadsheet (Quattro Pro), they are
>generally not very computer literate.  SAS is a programming language and
>foreign to most students' experience.  Also, SAS is a bit arcane and
>difficult to use.  For example, if a student leaves off a semicolon,
>SAS errors result, but the error messages do little good.  I tried using
>SAS ASSIST, the menu-driven front end for SAS, but there is no good
>textbook on SAS ASSIST and statistical methods, the PC version of SAS
>ASSIST is not documented in any text, and the version on our mainframe
>looks nothing like the illustrations in the SAS ASSIST manuals unless you
>run it from a graphics or X terminal.  The Schlotzhauer book spends too
>much time on very elementary techniques and not enough on the more
>advanced techniques our students need (e.g., regression methods).
>
>Beyond this, there is a problem with using data for "realistic" problems.
>Most data suitable for "realistic" problems are unsuitable from a pedagogic
>standpoint.  For example, census data are aggregated, so regression using
>census data should correct for aggregation (e.g., using GLS).  I would like
>the textbook to be oriented towards "real" social science (preferably urban)
>data, either by providing the data or having the students get them, and to
>take students through analyzing the data, step by step.
>
>Finally, most of our students' interests have a spatial component.  It would
>be most desirable to have spatial data and to use SAS' various mapping
>and graphics features.  Of course, this would probably imply having the
>coordinates for geographic features.
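As an aside on the aggregation point above, here is a minimal sketch of
why regression on tract averages calls for a GLS-style correction.  It is
in Python with simulated data, purely hypothetical (the course itself uses
SAS): averaging over tracts of unequal population makes the error variance
differ predictably across observations, and weighting each tract by its
population restores efficient estimates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate individual-level data in "tracts" of very different populations,
# where the true individual relationship is y = 2 + 0.5*x + noise.
n_tracts = 40
sizes = rng.integers(20, 500, size=n_tracts)
xbar = np.empty(n_tracts)
ybar = np.empty(n_tracts)
for g, n in enumerate(sizes):
    mu = rng.uniform(5.0, 15.0)                  # tracts differ in mean x
    x = rng.normal(mu, 2.0, size=n)
    y = 2.0 + 0.5 * x + rng.normal(0.0, 4.0, size=n)
    xbar[g], ybar[g] = x.mean(), y.mean()        # aggregation keeps only means

# Var(ybar_g) = sigma^2 / n_g, so GLS here reduces to weighted least squares
# with weight n_g -- i.e., scale each row by sqrt(n_g) and run ordinary OLS.
X = np.column_stack([np.ones(n_tracts), xbar])
w = np.sqrt(sizes)
beta_ols, *_ = np.linalg.lstsq(X, ybar, rcond=None)
beta_wls, *_ = np.linalg.lstsq(X * w[:, None], ybar * w, rcond=None)

print("OLS on tract means:", beta_ols)           # consistent but inefficient
print("WLS, weight = population:", beta_wls)
```

In SAS the same correction is available through the WEIGHT statement in
PROC REG; the point is only that aggregated data violate the equal-variance
assumption in a predictable, fixable way.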
>
>
>Jump Starting a Thesis
>----------------------
>
>Our curriculum does not give the students much exposure to substantive
>issues before they take this course.  Hence, asking them to prepare their
>thesis proposals really is sort of a dry run.  The best thing I can tell
>students preparing proposals is to immerse themselves in the literature on
>the topic.  Yet since this is a methods course, I really can't expect students
>to do literature reviews on low-income housing, economic development policy,
>etc.  Can anyone suggest a way to get students familiar with the research
>literature on a topic in very short order?
>
>
>Topics and Issues in Social Science Research Methods Course
>-----------------------------------------------------------
>What topics do you feel should be covered in a social science (planning)
>research methods course?  Why?  What is the best way to teach these?
>
>Please reply directly to me at the address below, as I don't ordinarily
>subscribe to all the lists I am sending this to.  I will summarize your
>responses and post them to the lists.
>
>If you want to open up these or related issues to wider discussion by posting
>them to the list, please be my guest (^-:.
>
>Thanks.

As a result of the comments I received, I revised the course objectives.
Since the students take this course when they are in their second semester
in our program, they have little or no exposure to substantive planning
issues, much less the literature on such issues, when they take the course.
Therefore, it is a bit too ambitious to expect workable thesis proposals,
and I made this an optional course assignment.  The focus is now (2), with
some minimal level of proficiency expected for (1) and (3).


1.1 General Comments
--------------------

As expected, several people made general comments about the course.
A general theme seemed to be that the course tries to do too much in one
semester, but I already knew this.  To me the more interesting questions are
what is essential, what can be left out, and how best to cover the essential
material.

On the other hand, several people are teaching or designing similar courses.
Ed Waltz, at the University at Albany, School of Public Health, said,
"the similarities between your course and my preliminary thoughts are
striking... right down to including a computer lab component using SAS,
interest in Geographical data, etc."  Similarly, Alan Zaslavsky at
Harvard said, "Thanks for sharing your concerns, the problems you raise
are real ones."  Nor are these concerns confined to the social sciences.
Sam Scheiner writes:

|I face exactly the same situation that you do with my intro stats
|course for our biology grad students.  At a minimum, I hope that they will
|be able to read a journal article intelligently and when they come to me
|for help with their thesis that I can communicate to them what they should
|be doing, even if we covered it in class and they have since forgotten.

Neil Henry, from Virginia Commonwealth University, asked some fundamental
questions:

|        Why does this program require a thesis?
|        If a thesis, why does it need to be quantitative research based?
|        If it must involve quantitative data analysis, why must regression
|       analysis be the most common method of statistical data analysis?
|
|While the last question is usually easiest to answer, the answer must
|presuppose valid answers to (1) and (2). In a program that forces
|a three credit course to deal with philosophy of social research,
|statistical methods at the level of multiple regression, data
|collection and analysis principles, and development of computer
|knowledge sufficient to deal with spatial analysis, it seems to me
|that valid answers to (1) and (2) will not be easy to provide.

I wrote Neil the following answers:

>>      Why does this program require a thesis?

+It requires either a thesis or a "thesis project."  The latter is usually
+interpreted to mean research applied to a specific problem.  I really don't
+know all the reasons why a thesis or thesis project is required, although
+I'd probably come out in favor of something like a thesis myself.

One reason I think writing theses is a good idea is that it gives students
"inside" knowledge of academic research and thereby lets them know first-
hand its strengths and weaknesses.  Conversely, it also teaches them, by
implication, the weaknesses and strengths of the socially accepted
"knowledge" of everyday life.

>>      If a thesis, why does it need to be quantitative research based?

+It doesn't.  But my feeling is that it certainly won't be quantitative
+if we don't teach quantitative methods and their utility.  Also, the goals
+of the course and the curriculum (at least my personal interpretation of it)
+include making students intelligent "consumers" of quantitative research,
+so we would not be doing our job if we did not teach quantitative methods
+at least at this minimal level.

>>      If it must involve quantitative data analysis, why must regression
>>      analysis be the most common method of statistical data analysis?

+There's no reason why it must be.  On the other hand, I'd say it is in fact
+more common in my field than any other single quantitative method.  Also,
+I like regression for several reasons: linear models are intuitively easy
+to grasp; it interfaces with experimental design through statistical controls;
+most planning data emphasize continuous (rather than discrete or qualitative)
+variables; things like transformations are readily introduced as important
+to regression analysis; regression can be expanded to cover discrete and
+continuous variables through ANOVA and ANCOVA versions of GLM (not to mention
+logit analysis); regression raises all sorts of questions about causality v.
+correlation, underlying process v. empirical appearance; etc.
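One of those claims -- that regression can be expanded to cover discrete
variables through the ANOVA version of GLM -- can be shown in a few lines.
This is a hypothetical sketch in Python rather than SAS, kept deliberately
small: regressing on group-membership dummies reproduces exactly the cell
means a one-way ANOVA compares.

```python
import numpy as np

rng = np.random.default_rng(1)

# Three "treatment" groups with different means -- classic one-way ANOVA data.
groups = np.repeat([0, 1, 2], 30)
y = np.array([5.0, 8.0, 11.0])[groups] + rng.normal(0.0, 1.0, size=groups.size)

# Dummy-variable regression: intercept plus indicators for groups 1 and 2.
X = np.column_stack([np.ones_like(y),
                     (groups == 1).astype(float),
                     (groups == 2).astype(float)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# The fitted coefficients are exactly the cell means (and their differences).
means = np.array([y[groups == g].mean() for g in range(3)])
print("intercept (group 0 mean):", beta[0])
print("group 1 effect:", beta[1], "=", means[1] - means[0])
print("group 2 effect:", beta[2], "=", means[2] - means[0])
```

This is why GLM-style procedures can treat discrete and continuous
predictors uniformly: dummy coding turns a classification into ordinary
regression columns.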

Even if the content of the course is justifiable, that does not mean the
structure is.  Neil goes on to offer the following observations:

|At least 6 and preferably 9 credits are necessary beyond the
|minimal undergraduate statistics and computer literacy courses you
|mention to provide the grounding you hope to provide. I say this
|after many years of teaching graduate students in a variety of
|Master's and doctoral programs clustered around sociology
|(social work, education, gerontology, public administration, etc.).
|If these programs do not demand prerequisite grounding in research
|methods, including computer facility, but feel that a research thesis
|is necessary, then more instructional time must be provided. In
|most cases, especially in professional programs, I believe that
|the emphasis on an original, academically respectable Masters'
|thesis is misguided.
|
|Which is to say that you, like many of us, have a hopeless
|task. Which we nevertheless continue to try to succeed at.
|But at the same time that we search for the perfect text, teaching
|method, and computer package, we must also attack the premises that
|created a program that dumped these expectations into a single course.

Neil does not say why he believes an "original, academically respectable
Masters' thesis is misguided", so I can't assess this point.  As my comments
above indicate, I think most of the course content is justifiable, even for
students who don't write theses.  Such students still must be intelligent
consumers of others' research, and they must be able to think
independently.  Learning about social research should help in both.  On the
other hand, I do agree with Neil's assessment of the impossibility of covering
sufficiently in a single semester everything that should be covered.

Neil has some good ideas about this.  In a later missive he wrote:

|You used a phrase that I'm familiar with: educating students
|who will be consumers of quantitative research. That has been a theme
|in sociology for 25 years at the levels below PhD, and I think is
|appropriate for almost all professional social science degree programs
|unless the individual student is going to be a methodological
|specialist (in which case she should have a couple of courses at least
|in advanced stats and computers). Over the years the most successful
|approach I believe has been to build a course with that purpose around
|primary research: a casebook of articles and research studies. Instead
|of the usual "start at the beginning" methods or stat course, begin
|with the finished product and go back through the steps that had to be
|taken to achieve it.

|However this approach tends to overburden the individual
|instructor and eventually become idiosyncratic. Readers get published
|for such a course (continually) but usually go out of date pretty
|quickly and may be too narrow for somebody else to use successfully.
|That is, at least, my experience. It's also my experience that
|the better the grounding in elementary stat, the more the student will
|get out of that course. But if you can put together such a reader it
|can help demystify multiple regression and other multivariate
|techniques which are usually presented too cleanly and simply in texts
|for social scientists (so that the teacher can construct multiple
|choice tests from the material?).

You're right on the money, Neil.  A few years ago I put together a
collection of readings I pretentiously called, "Exemplars in Planning
Research."  It's an awful lot of work keeping it up to date, and quite
expensive for the students.  I'm going to try the jigsaw approach this
semester as a better alternative, providing I have the time.

Kelli Klebe, in Psychology at the University of Colorado at Colorado Springs,
said, "you are trying to do a lot in one course."  Keith O'Rourke, from the
University of Toronto, said the course as outlined "would be most ambitious
even for PhD graduate students or post-graduate fellows -- at least
for a statistics graduate program in regard to 'isms'."  Phil Sidel said:

|your course objectives sound very ambitious -- I think
|they may be too ambitious.  I would suggest deciding on some phase(s) of
|it that should be postponed till you have covered the rest well.  Then if
|you don't get to that part of the course, at least you have them started.

Phil goes on to describe one way to get around the problem:

|When I taught "Computer Applications in the Social Sciences" I spent the
|first half of the course teaching the tools and general computer literacy.
|Then in the second half it was seminar style -- each student did a project
|and presentation on a different application (slightly over half were
|statistical analyses of varying degrees of sophistication--depending on
|the capabilities of the student and preparation time).  Thus, all students
|were "introduced to" some various higher level applications, but only one
|had to actually learn and do each one.

Of course, this only works if the sole focus is computer applications.  I
suppose the basic structure -- half a semester of instruction, then let the
students loose on individual topics -- could be adapted to a broader range
of concerns.

One of the more interesting replies in this vein came from Stephen Laffey,
a second-year grad student in geography at Northern Illinois University.
Steve says:

|You are attempting to do too much in too little time.
|
|Our program at NIU has a 2 semester sequence to cover what you wish to do in
|one (GEOG 561 & 563).
|
|The first course is taken as a sort of intro to grad school where we are
|introduced to the 'science' of geography. The course "begins with an intro to
|the methodology of science and scientific inquiry".  The primary goal of this
|class is to learn how to structure creative energy into meaningful channels
|of scientific inquiry.  The primary product is a thesis proposal.  This class
|serves as excellent preparation of how to approach problems in a 'professional'
|manner.
|
|The second course is primarily an advanced research class that incorporates
|research concepts learned in the first class with advanced quantitative
|methods.  By this time most students have already had at least two classes in
|statistics so the class centers more on what statistical analyses are
|appropriate for what types of problems.  The primary product of this class is
|a full blown research paper that utilizes at least one method of quantitative
|analysis.  SAS is the primary tool used, as well as ARC/INFO for those who are
|GIS oriented.  Neither of these programs are user friendly, but are essential
|tools to be mastered.
|
|Both classes are taught by the same individual, Dr. Andy Krmenec, and provide
|an excellent introduction to the science of geographic inquiry.  These classes
|include professionals from many disciplines, including planning agencies.  In
|theory, at the conclusion of these two classes, an individual would have the
|introduction, background and methodology of their thesis completed.  In
|reality, most of us realize that our initial concepts of what makes a good
|idea a thesis, are not very accurate and out of the dozen or so students who
|went through the 2-semester sequence together, only one or two actually used
|the same research idea for their thesis.
|
|To summarize, I would suggest that you scale back your expectations and use
|the two semester approach.

I wish I had the luxury of doing this.  It appears that by the time these
students have finished their first year they have taken two graduate-level
statistics courses plus this two-semester sequence in social research.
Notice that the "research paper" in the second semester is neither a thesis
nor a proposal.  This seems like good pedagogy to me, since a research
proposal is awfully abstract unless its author has had first-hand
experience in doing research.

Steve goes on to discuss the course sequence in broad outline.  It does
sound excellent to me, although it leaves out other things that might be
equally useful (an introduction to computer methods with word processing
and spreadsheets is one example):

|We did not use any particular text, but did a series of readings representing
|many viewpoints and then discussed the readings in class.  The proposal and
|paper were presented in class at each stage; research question/literature
|review, concept paper/proposal outline, 1st draft/mini proposal and final
|draft/full proposal (561/563).  These 2 classes form the core of the graduate
|experience at NIU and provide the professional background required for future
|success.  The peer review of research projects is of particular importance
|since one must become accustomed to receiving AND giving constructive
|criticism in the 'real world'.
|
|Some of the main topics covered include:
|563:    Scientific truth                what is knowledge
|        scientific writing
|        history of scientific thought, all of the ism's
|        data, data, data                data mining
|        theory & model building         ETHICS & RESPONSIBILITY
|
|561:    linear algebra review           maximization/minimization
|        probability                     Hypothesis testing
|        ANOVA/MANOVA                    covariance
|        regression                      regression diagnostics
|        factor analysis                 principal components
|
|I hope this provides you with an insight from the student point of view.
|Good Luck with your class!!

Certainly this enthusiastic note from a student is testimony to the course
sequence's appeal.  However, judging from the topics covered, I wonder what
background the students have and how they view geography.  It appears to me
they may have more of a scientific outlook than many of my students do.
Many of my students come from non-science backgrounds (e.g. landscape
architecture) or, at the other extreme, natural science backgrounds (e.g.,
geology).  I find that my students rarely have background in both social
science and mathematics sufficient to tackle the subjects Steve mentions
(see Section 6.4.1).

After I gave Steve some more information about my course, he suggested:

|Given your constraints, I would focus mainly on the practical aspects of what
|scientific inquiry is and how the principles of science can be applied to
|problems in planning.

|I think the open forum discussion presentation platform is one of the best
|components and would recommend using that approach to the largest degree
|possible.  Especially if most of your students are planners, or hope to be,
|they need to get comfortable in front of groups and learn how to think on their
|feet.

Apparently other schools are having similar "growing pains" integrating
computers into their curricula.  Steve goes on to say that at his school:

|'93-94 will be the first year that cartography will finally be taught on
|computer.  In your case, maybe you could minimize the fluff and concentrate
|on the content by using the problem solving approach.  I think
|your students will appreciate being taught real content and being challenged
|intellectually to improve their scientific reasoning and problem solving
|skills.

|I have found that it is the ability to structure and solve practical problems
|that I am most grateful for learning here.  Without the deductive/inductive
|dichotomy to provide a critical context to make decisions in, it is hard to
|move on to the next plateau where you can learn technique and apply the results
|to questions that you are interested in.

|I am not real familiar with the pressing problems in the planning discipline
|but maybe if you start with some of those problems and work backwards
|to discover which techniques and methods are commonly applied to solve them,
|you can provide a rationale ... on why you ...  chose to concentrate on
|[what you chose to focus on].

+This assumes the choice of method rests on pragmatic and dispassionate
+evaluation.  Unfortunately, my experience leads me to reject this
+hypothesis.

Ron Roeder, an undergraduate sociology major at the University of Texas
at Dallas, also liked this aspect of the proposed course:

|Best of luck with your class.  I envy your students the participatory
|element they are going to enjoy, and wish I were there to take part in the
|class.  My library is heavy with tomes on the family, changing family
|structure, etc.  I need the urban / community concentration.

Of course I agree with Ron.  I always find urban issues, which bring
together large-scale social processes, far more interesting than the
micro issues in the typical course on the sociology of the family.


2 Textbooks on Social Science Research Methods
------------------------------------------------

Here is my original discussion of the issues:

>I plan to use _Doing Urban Research_ by Gregory Andranovich and Gerry
>Riposa (Sage, 1993).  This book has several virtues.  It is short (98 p.
>of text), has class exercises, considers diverse theoretical orientations
>and considers non-positivist urban research (e.g., Mike Davis, Mark
>Gottdiener, etc.) to be today's mainstream, covers key issues (spatial
>levels of analysis, the unity of theory/basic/applied research, research
>design, data collection, qualitative research, and reporting research).
>Obviously, the coverage in such a short book is cursory.
>
>Still, I have some misgivings about the book.  It presents "the scientific
>method" as the standard Popperian account of deduction, hypothesis, etc.
>It emphasizes causality, and its notion of causality goes little beyond
>that of Hume and JS Mill.  Its abbreviated form and common-sense writing
>style may gloss over very complex issues while giving too brief treatment
>for students to master the material adequately.  Its treatment of epistemology
>and the philosophy of science is almost nil: I would like much more
>explicit discussion of the various epistemological "isms" (positivism, realism,
>subjectivism, rationalism, postmodernism, feminism, etc.).  The political
>implications of epistemological positions are never discussed.  Etc.
>Can anyone suggest another social science research methods text to use either
>instead of it or alongside it?


2.1 List of Suggested Texts
---------------------------

Babbie, Earl R.  _The Practice of Social Research_ 6th ed. Wadsworth.

Bainbridge.  _Social Research Methods and Statistics: A Computer Assisted
        Introduction_

_______. _Survey Research: A Computer-Assisted Introduction_.

Black, James A and Dean J. Champion. 1976. _Methods and Issues in Social
        Research_.  New York: Wiley.

Bogdan, Robert and Steven J. Taylor. 1975. _Introduction to Qualitative
        Research Methods_.  New York: Wiley.

Burton, Thomas L. and Gordon E. Cherry. 1970. _Social Research Techniques
        for Planners_. London: Allen and Unwin.

Light, Richard J., Judith D. Singer, and John B. Willett. 1990. _By Design:
        Planning Research on Higher Education_ Cambridge: Harvard.

Mattick, Paul. 1986. _Social Knowledge: An Essay on the Nature and Limits
        of Social Science_. Armonk, NY: M.E. Sharpe.

Patton, Michael Quinn. _Qualitative Research and Evaluation Methods_. Sage?.

Rosenthal and Rosnow. 1991. _Essentials of Behavioral Research_. NY: McGraw-Hill.

Wiseman and Aron. _The Student Sociologist's Handbook_


2.2 Comments on Suggested Texts
-------------------------------

Byron Davis, from the University of Utah (?), recommended Babbie
as an excellent book reasonably well received by the students.  T. Robert
Harris, from the University of North Dakota, also recommends Babbie,
and the Wiseman and Aron handbook.  Babbie seems OK to me.  It is the best
standard "research methods" text I received in time for course adoption,
so I decided to use it.  It is well written and organized, and it comes with
useful teaching aids in the form of an instructor's manual and student
workbook.  I was also swayed by Ron Roeder.  Ron, you'll recall, is an
undergraduate sociology major.  He writes:

|My Research Methods course used Babbie's "The Practice of Social
|Research".  It may be too abstract in some places, or not enough, but it
|has a nice section on the 3 factors necessary to establish causality - though
|it's not particularly esoteric in its arguments.  It may be too simple for
|a grad course such as you propose.

Hey Ron, don't assume graduate courses are necessarily any more complex than
undergraduate courses.  Given the diversity of student backgrounds in planning
school, simplicity is a definite virtue.  Nonetheless, Ron's mention of THE
"3 factors necessary to establish causality" illustrates Babbie's
positivism, and this is precisely the dogma I want to counter.  I will
attempt to do so with some supplemental reading and a required book review
(see the course overview below).

I have not yet had a chance to look at most of the texts mentioned above.
Phil Sidel says he has not seen a book that does a better job of presenting
practical research issues than Bainbridge's _Survey Research_.  However,
the book's narrow focus on survey research does not suit my purposes.

Kelli Klebe says Cook and Campbell is difficult for students (1st year
psychology graduate students) to read but has very good coverage of validity
and the kinds of statements that can be made in nonexperimental research.
Jack Needleman, at the Kennedy School of Government's Wiener Center for
Social Policy, says Light, Singer, and Willett "does an excellent job
of addressing issues like methods and measure selection, question design,
sample size, etc."  He also says Patton gives an excellent overview of
qualitative research methods.  Peter Flom, at Fordham, likes Rosenthal and
Rosnow for social science research.


3 Supplemental Reading
------------------------

This is how I originally posed the problem:

>Partly to deal with these shortcomings, I want to use a "jigsaw" approach
>in class sections.  This teaching technique uses 3-4 supplemental readings
>per course period.  Students must read at least one of the readings, and
>in class they break up into groups in which the students collectively have
>read every reading.  They then share what they read with the other students
>in the group, and the group works together to answer a question or do a task.
>Then the groups report to the class, and class discussion focuses on the
>topic of the day.
>
>Different topics I'd like to treat this way include: the relation between
>theory, knowledge, and practice; causality; conceptions of science and
>human knowledge; the uses of research; the relation between theory and method;
>levels of analysis; defining research goals and objectives; literature
>reviews; research design; data collection; data sources; reporting research;
>design of research instruments; overviews of data analysis techniques; etc.
>Can anyone suggest readings that might be useful in this setting and for
>this purpose?


3.1 List of Suggested Readings
------------------------------

Brooks, R J, et al. 1978. "A Note on Forecasting Car Ownership." _JRSS_ Series A,
        141, Part 1: 64-68.

Campbell. "Prospective: artifact and control" in _Artifact in Behavioral
        Research_, ed. Rosenthal and Rosnow. New York:Academic.

Campbell and Stanley. _Experimental and Quasi-experimental Designs for
        Research_.

Cook and Campbell.  _Quasi Experimentation_

Feagin, Joe, Anthony Orum and Gideon Sjoberg. 1991.  _A case for the case
        study_.  Chapel Hill: University of North Carolina Press.

Holland. "Statistics and Causal Inference" _Journal of the Amer. Stat. Assoc._,
        81:945-960 with discussants.

Sage _Scientific Geography Series_ books.

Sage _Quantitative Series_ books.

Weber, Karl E. and Tilakasena Abeyrama. 1984. "Research in Planning for
        Human Settlements: Guidelines and Practical Examples". HSD Reference
        Paper no. 2, 184 p.

________.  1988. "Observation - An Effective Alternative to Oral
        Communication: Nine Topical Surveys Schedules". HSD Reference Paper
        no. 9, 37 p.

Weber, Karl and Indra P. Tiwari. 1992.  "Research and Survey Format Design:
        An Introduction." HSD Reference Paper no. 23, 109 p.


3.2 Comments on Suggested Readings
----------------------------------

These readings fall into three categories: data analysis, research methods,
and the philosophy of science.  The latter are in Section 3.3.

3.2.1 Data Analysis
-------------------

To make clear that assumptions about the form of the errors are important,
Mike Fuller (Canterbury Business School, University of Kent) suggests the
Brooks, et al. piece:

|which makes the point graphically (in both the literal and metaphorical
|senses).  ... The logistic model is used to estimate the saturation level
|for car ownership.  The only difference between the two models is the
|assumption about the errors; are the errors additive for estimating persons
|per car, or cars per head? The difference is substantial, with each central
|estimate of the saturation level being outside an approximate 95% confidence
|interval based on the other error model.
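To make the point concrete, here is a minimal sketch of the issue (this is
NOT the Brooks analysis: the data are synthetic, and the curve parameters and
starting guesses are made up).  The same logistic curve is fit twice to the
same series, once assuming additive errors in cars per head and once assuming
additive errors in persons per car; the two saturation estimates differ:

```python
import numpy as np
from scipy.optimize import curve_fit

# Logistic saturation curve: cars per head at time t, approaching
# saturation level S.  (All parameter values here are made up.)
def cars_per_head(t, S, a, b):
    return S / (1.0 + np.exp(-(a + b * t)))

# The reciprocal of the same curve: persons per car.
def persons_per_car(t, S, a, b):
    return 1.0 / cars_per_head(t, S, a, b)

rng = np.random.default_rng(1)
t = np.arange(50.0)
y = cars_per_head(t, 0.5, -2.0, 0.15) + rng.normal(0.0, 0.01, t.size)

# Model 1: errors additive on the cars-per-head scale.
popt1, _ = curve_fit(cars_per_head, t, y, p0=[0.4, -1.5, 0.1])

# Model 2: the same curve and the same data, but least squares on the
# reciprocal scale, i.e. errors additive in persons per car.  Early
# years (small y) now carry much more weight.
popt2, _ = curve_fit(persons_per_car, t, 1.0 / y, p0=[0.4, -1.5, 0.1])

print("saturation level, additive cars-per-head errors: ", popt1[0])
print("saturation level, additive persons-per-car errors:", popt2[0])
```

The only thing that changes between the two fits is the scale on which the
errors are assumed additive, yet the estimated saturation levels come out
different, which is exactly the Brooks et al. point.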


3.2.2 Research Methods
----------------------

Hari Srinivas (Department of Social Engineering, Tokyo Institute of Technology)
suggested the items by Weber, the Dean of the School of Environment,
Resources, and Development at the Asian Institute of Technology.  These items
can be obtained from Professor Weber at:

        School of Environment, Resources and Development,
        Asian Institute of Technology
        G.P.O. Box 2754
        BANGKOK - 10501
        Thailand

Hari also says, "I have a list of 24 other books that can be used as
additional references.  I'll send it if necessary."  Now please don't
deluge him with requests (you'll notice I am not giving his e-mail
address).

Kelli Klebe describes Campbell and Stanley as a short book (about 80 pages)
with no "fluff".  She also describes the Sage Quantitative Series books
which cover such topics as: "causality, experimental design, factor analysis,
covariance structures and about any other statistical method you can think of."
As a graduate student Kelli took a methods course that used these books as
"supplemental reading" in the manner I described.  Presumably she was happy
with the outcome.

Jim Fisher also likes the Sage series.  He says, "there are lots of little
Sage books that can fill in the gaps on methods and even a fairly good one
(series) on spatial analysis - Scientific Geography Series - though it is
a course on its own."  I am unfamiliar with the _Scientific Geography
Series_; I'll have to look into it.  Jim also likes Cook and Campbell as a
"fair bridge from epistemology to practice."

I am familiar with the Sage series and own several of the books myself.
However, they tend to focus too much on technical issues for my course.
Nonetheless, Sage provides a valuable service by offering this series.

Dean Nelson, who describes himself as a "biometrician turned applied social
researcher", offers the following argument in favor of Campbell and
Stanley:

|I have been involved in several social research projects while working on my
|PhD and have helped dozens of other graduate students with the statistical
|analysis of their dissertations, all social research projects.  What all of
|these have in common is the lack of strict experimental control.  I suspect
|that the research done by those in urban planning is quite similar, that a
|quasi-experimental design is applied to the data in which there is no
|explicit randomization or strict environmental controls.  A good little book
|for identifying a lot of these problems is  "Experimental and Quasi-
|Experimental Designs for Research" by Campbell and Stanley.

Dean also suggested Campbell's "Artifact and control" article and went on
to tell the following story about it:

|The President of Lehigh was reported
|in the paper as having a salary which ranked among the top for college
|presidents.  When asked what he felt about making that amount of money,
|he replied, "I know I could be making a lot more money in industry,
|but the work I do at Lehigh is very rewarding." This answer was ingenious,
|because it turned the assumption of the questioner on end.  Instead of taking
|the question as one asking him to defend his salary as being too high,
|he defended it as being too low.

|In the same way, Campbell (in Artifact and Control) answers the question of
|"What can we know?".  Usually, research results in the social sciences,
|especially among those who should know better, are presented as little bits
|of knowledge we have obtained, and they are defended against criticisms.  In
|this paper, Campbell comes from the other direction, first admitting that
|the number of alternative explanations is infinite, and then adding,"but the
|number of plausible alternative explanations is quite small" in comparison.
|This somewhat cynical view is my own, that we know much less than we admit,
|and that those who have the least reason to assert conclusions are the very
|ones who defend them most vigorously.  I think the admission of an infinite
|number of alternative explanations dismisses the myth of knowledge in favor
|of belief, and that some explanations are more plausible than others allows
|us to judge the rationality of our beliefs with respect to alternative
|beliefs.  This seems to me to be both a realistic and hopeful view.

Dean also summarizes the Holland article as arguing there is "no
causation without manipulation".  In other words, causality can only be
shown in designs which contain variables manipulated by the experimenter.
So, for instance, sex can never be shown to be a cause of anything.

In a very different vein, Anthony Orum, in the Sociology Department at the
University of Illinois at Chicago, suggested the Feagin, Orum, and Sjoberg
book on case studies.  Here's what he has to say about it:

|[It] discusses both the logic of case study/qualitative research and provides
|a number of examples of case studies applied both to cities and to other
|realms.  The book got a very nice review in CONTEMPORARY SOCIOLOGY and,
|I think, provides a very nice presentation of and defense of qualitative
|kinds of research on, among other things, urban topics.  Usually I would
|hesitate to recommend my own work, but in this instance, you might find
|our book most helpful -- and it is readable.


3.3 Anti-positivist Supplemental Readings
-----------------------------------------

Most of the books discussed above supplement the basic positivist view of
science.  Since most standard social science research textbooks already
cover this topic in some depth, the topic of anti-positivist views of
social science deserves its own heading.

3.3.1 Suggested Anti-Positivist Readings
----------------------------------------

Alcoff and Potter, ed. _Feminist Epistemologies_

Duhem, Pierre Maurice Marie. 1985. _Medieval cosmology: Theories of
        infinity, place, time, void, and the plurality of worlds_, ed. and
        trans. Roger Ariew.  Chicago: Univ. of Chicago.

---------. 1991. _The origin of statics: The source of physical theory_.
        Boston: Kluwer Academic.

Goodman.  _Empirical Knowledge_.

Gould, Stephen Jay.  _The Mismeasure of Man_.

Hacking, Ian. 1983.  _Representing and intervening : introductory topics in
        the philosophy of natural science_.  New York: Cambridge University
        Press.

Harding, Sandra. _Whose Science? Whose Knowledge?_

Jaki, Stanley. 1979. _The origin of science and the science of its origin_.
        South Bend, Ind: Regnery/Gateway.

Reinharz, Shulamit. 1992. _Feminist Methods in Social Research_. Oxford
        University Press.

Sayer, Andrew. 1992. _Method in Social Science: A Realist Approach_, 2d ed.
        London: Routledge.


3.3.2 Comments on Suggested Anti-Positivist Readings
----------------------------------------------------

Keith O'Rourke summarizes the Hacking book as being "written for a
one-semester course in 'isms'."

Catherine Dalzell, from the Canadian Red Cross Society, gave one of the
most thoughtful discussions of this topic.  Basically, she argues against
using _any_ text because an understanding of science is not gained by
reading it in abstracted form.  She summarizes her argument by saying,
"I think the ``text'' you are looking for is an undergraduate philosophy
program.  I doubt you will find any shortcuts down that road."  Here is how
she comes to this conclusion:

|Is this [covering philosophy of science] necessary?  I ask, because
|personally I think that although it is good for us scientists to read
|that sort of material, I think it should be taught by the pros.
|The philosophy of science gets you into ``big-time'' philosophy together
|with the history of ideas:  there is not a major philosophical issue that
|is not essential to this area:  you need epistemology, metaphysics, and
|in the case of a social science, ethics.  (Why?  because you have to have
|some idea of what the good life looks like.  Maybe people really flourish
|in high rise apartments surrounded by free-ways.)  You also have to know
|how mathematics fits into the scheme of things. You need to decide if
|you believe in God, and if so, what kind of God.
|
|I was recently asked to review a book on the philosophy of science
|that touched on the theories of Popper, Lakatos and other moderns, so
|I have had to give the question some thought.  None of the theories
|really seemed to connect with my experience of what happens when I am
|thinking scientifically. None of these philosophers had grappled with
|the distinction between a theory of scientific method on the one
|hand, and the practice of statistical experimental design, on the other,
|which is the **technique** often used to ensure that a sound method has
|been employed.  Perhaps that is the purpose of science:  to take a
|philosophical precept and turn it into a mathematical argument.
|
|I think you could do your students a lot of good by pointing out the
|superficialities of their textbook in this area;  as to the correct
|approach to urban research, maybe  they will acquire that by watching
|how you and their other instructors approach the material. Basically,
|it's the subject matter that must determine the method, and not some
|theoretical construct about the ``scientific method'' lifted from
|another discipline.

I couldn't agree more!  Of course, that's precisely the point made by
Bachelard, Althusser, and the other French philosophers who use the term
"problematique" instead of "paradigm" to capture the notion of theory-based
ontology and methods.  I'll be interested in seeing what people think of my
method for doing what Catherine suggests.

She goes on to say:

|Revolutionaries are always products of the Ancien Regime.  To understand
|the revolution you need to study the period before it took place, not
|the self-serving justifications served up afterwards.  Similarly, my
|experience has been that I learned a lot more about why science works
|and what it is by reading Aristotle and the Medieval philosophers than
|Hume, Mill, the Vienna Circle, and all these others writing after the
|fact, and trying to recruit science for their own agendas.  Stanley
|Jaki is a good reference, as is Pierre Duhem, although there is no
|substitute for the originals.  Of course, the Medievals wanted science
|but did not know what it would look like when they got it (like all
|revolutionaries!).  What we got does not fit in with any one of their
|numerous and contradictory theories:  science actually seems to  have
|originated in a combination of nominalism in observation, aristotelianism in
|epistemology, and neo-platonism in math, kludged together with the faith
|that science is an activity that God wants men to pursue.  It's a dog's
|breakfast from the philosophical point of view.

|I think it is fair to say that since the 16th century, metaphysics
|has had very little impact upon the development of knowledge.  A philosophy
|of science should guarantee the validity of that science, but if there is
|no universally held philosophy, or philosophical school, or philosophical
|method --- which there is not any more --- then I don't know if there is much
|benefit to the exercise.  In practice, the actual sciences as practiced
|and as they refine their own methods of proof, are very much richer
|and more nuanced than the theories used to explain them.  Doesn't Popper
|strike you as being thin?

Absolutely flat, Catherine.  Still, almost all the textbooks on social
science research methods make some attempt to contextualize the methods
with an account of "the" scientific method.  So your point about countering
the text is well taken.  However, I have never been successful just telling
students the textbook is BS: generally, I have had to offer a positive
alternative.  Also, there is a real danger that planning students will
leave planning school with an implicit epistemology that cognitive
psychologists call "multiplicity."  This is the belief that _no_ knowledge
_is_ better than any other.  If we just debunk the positivist account of
science, we probably add to the students' tendency to view all knowledge as
simply one or another "perspective."  Yet given all the constraints on the
course, I can't do much to offer another perspective.  My solution is to
compile a list of anti-positivist books and to require each student to read
at least one of them.

The Goodman book is perhaps the least anti-positivist suggestion.  It comes
from someone who, because of the tone of his comments, I will keep
anonymous.  There's no sense in having some assistant professor's tenure
made problematic by my broadcasting this individual's name along with comments
about departmental colleagues on the net:

|I would not suggest it as a text, but he makes an extremely sound argument for
|empiricism as the basis of all knowledge.  He ties together realism and
|skepticism and demonstrates how these may actually be counter-knowledge.  He
|doesn't idolize Popper, which in my opinion, is a fresh view point.  I come
|from a strong empirical background and currently work in a faculty where
|process and causality are the only valid approaches to solving problems.
|This book has been a God-send in enabling me to fire rockets into the base
|of this narrow viewpoint of science.  Last year no descriptive studies were
|admitted as valid scientific endeavors, on Monday our bastion of reductionist
|science admitted that descriptive studies were fine as long as hypotheses
|were not artificially imposed on the proposal.  Progress and broadening,
|I think (?).
|
|If you can achieve it, present as many viewpoints about science and the
|scientific method as possible.  There are many excellent critical essays
|of the scientific method as THE method of science.

I don't know how anti-positivist a book advocating empiricism is.  But if
it seriously considers realism and skepticism, and if it critiques old KP,
then it's a definite step up from the usual claptrap about "science" in most
social science methods texts.

John Buttrick, in economics at York University, suggested Gould's
_The Mismeasure of Man_.

Randy Albelda, from UMass-Boston, had some of the most useful suggestions
and comments:

|There's been a mini explosion of good feminist critiques of scientific method
|and epistemology -- mostly by feminist philosophers.  My favorite is Sandra
|Harding .... There's also a good edited volume called _Feminist
|Epistemologies_ Alcoff and Potter, editors.  What's nice about these types
|of pieces is that they are critical of 'objective science' without dissolving
|into post-modernism.  They do a good job -- especially for women students
|-- of talking about what it takes to be a scientist (i.e. having a wife)
|-- which is by no means a neutral observation.  I recently picked up
|_Feminist Methods in Social Research_ by Shulamit Reinharz which is much
|less heady than the above -- no economics but talks about oral history,
|case studies, cross-cultural research.  Could be a very nice complement.

Stephen Boyce, in Political Science at the Australian National University,
suggested Andy Sayer's book.  Stephen writes:

|It's got an excellent critique of Popperian and other empiricist views of
|science and puts the critical/transcendental realist view quite clearly
|and concisely.  It aims to provide students with a critical orientation
|which they can use to design/conduct their own research.
|
|**My reservation**: Sayer (like Bhaskar and some others in this school)
|understates the role of mathematical modelling in physics (and other natural
|sciences) - this means that his critique of `quantitative' research methods
|(mathematical modelling) in social science needs to be treated very cautiously.

To add my own two cents, Sayer's book is particularly useful in planning
since it draws on what Sayer calls "the urban sciences" for examples.  Some
of the realists I'm familiar with tend to confuse "naturalism" with
"realism."  In other words, they argue that a single model of science
unifies the natural and social sciences; they then go on to critique the
positivist view of natural science (often drawing on physics for examples)
and substitute a realist view; and then go on to argue by analogy using
a reconstructed view of physical science as a model for the social sciences.
I have the most trouble with the last step: just because the underlying
structure of inquiry is identical in the natural and social sciences does
not mean the techniques used in each must also be identical.  The
appropriateness of a given technique depends on the subject matter and the
theory used to orient the research.  So, not having read Sayer cover to
cover yet, I would consider his understating the role of mathematical
modelling a serious problem only if he insists on using physics as the
model for social science.


4 Guides or Aids for Writing Theses and Thesis Proposals
----------------------------------------------------------

In the original, I described this as "self-explanatory".


4.1 Suggestions
---------------

Allan Feldt sent me a long (3600 word) document he gives to planning
students at Michigan.  It is very good and chock full of tidbits on things
like self-discipline.


5 Jump Starting a Thesis
--------------------------

The problem I posed had no easy solution, which is why I changed the course
objectives as described above:

>Our curriculum does not give the students much exposure to substantive
>issues before they take this course.  Hence, asking them to prepare their
>thesis proposals really is sort of a dry run.  The best thing I can tell
>students preparing proposals is to immerse themselves in the literature on
>the topic.  Yet since this is a methods course, I really can't expect students
>to do literature reviews on low-income housing, economic development policy,
>etc.  Can anyone suggest a way to get students familiar with the research
>literature on a topic in very short order?

However, I did get some excellent ideas on helping students make progress
towards their thesis proposals in the context of this course.

5.1 Suggestions
---------------

Peter Meyer at Louisville argued it is futile to try to "jump start"
a thesis in a methods course before students have a fair amount of in-depth
exposure to substantive planning literature:

|I don't believe you can jump-start a thesis in a semester while doing
|other things... HOWEVER, you can, if the library is good enough, get students
|to at least probe the range of existing literature they'll want to tap, using
|the CD-ROM citation (and abstract, most importantly) data bases. This will
|force them to explore different keywords and to recognize that language
|differences in different disciplines may have to be overcome to generate a
|good synthesis of existing literature on which to build.

Dennis Roberts made a similar comment and argued for treating data analysis
in a separate course (a luxury I do not have):

|1) Seems to me that doing data analysis in this course given that it only
|leads one to a draft type of proposal ... is counterproductive.  If there
|is no real connection between the proposal AND the labs, what is the purpose
|of the labs?  Did they not have some labs in the intro stat course you mention?

It seems to me one could argue the reverse: if there's no real connection
between the proposal and the labs, perhaps the proposal should be dropped.
In fact, I decided to do something along these lines because of
the impossibility and inappropriateness of having the students comprehensively
review the literature on their chosen topics.

Incidentally, the students must take an undergraduate statistics course
before they come here.  So, I do not know what they studied or how
the course was taught.  I do ask them to list the course they took and tell
them I assume they know certain statistical topics.  But that's the
extent to which I have control over the content of this prerequisite.

Dennis went on to suggest emphasizing the literature's substantive content
over data analysis:

|2) I would put more stress on carving out some lit area and having them review
|that ...  rather than the labs.  You mention that they don't have much
|exposure to substantive content but ...  a proposal DEPENDS on a reasonable
|grasp of the content to be ABLE to design some investigation to improve
|on what is already done.  3) Why not approach this from a replication
|point of view ...  have the end product be a replication proposal and
|THEN maybe the labs could simulate the types of data needed in such
|a replication.

I think this is a good idea, although unworkable in the form that Dennis
suggests.  Replication studies are very rare in planning (and most other
social science) research (please no flames!), so students are apt to see
this as irrelevant to their concerns.  Furthermore, simulating data is one
of the more difficult skills to master with SAS, and I'd just as soon stay
away from having students simulate data.  Nonetheless, I probably will use
Dennis' suggestion for an extra credit exercise (see my overview of the
course design, below).

In a somewhat different vein, Kelli Klebe says the introductory methods class
for 1st year psychology students at the University of Colorado at Colorado
Springs does have students do first draft thesis proposals.  However:

|we also suggest that the only way to get an idea is [for the students] to
|immerse themselves in the literature.  But since this is a course in methods
|and not "substantive topics" we do not have them include a full blown
|literature review in their proposal.  Just a relevant 5-10 articles which
|helps set the stage for hypothesis and variables.  We figure that their final
|thesis proposal may change and part of the goal here is to give them a kick
|start on thinking about it and thinking about design with something that they
|find "more relevant" than instructor thought-up ideas.

Hari Srinivas, from the Tokyo Institute of Technology, gave the following
practical suggestion on how to structure this:

|One group of students will prepare abstracts for previously selected "good"
|theses, but with an added purpose of understanding the methodology of writing
|theses and proposals.  Presentations can be made by this group to the rest of
|the class.

I currently do something like this, but I leave it up to the students to
select the thesis.  Hari's idea seems better, since it puts a tighter
control on what the student learns to strive for.

Hari recommends building on this experience by having the students then
turn to the research literature:

|[I] use the same technique of "supplemental reading" mentioned by you.
|Ask students to prepare abstracts from several journals, in groups, and then
|file them; interested students can go into more detail after reading these
|abstracts.
|
|Here's a format for preparing abstracts:
|......................................................................
|Library where book is available:
|Library Classification number:
|------------------------------------------
|Authors:
|
|TITLE:
|
|Place of Publication:
|
|Publishers:
|
|Year:
|
|Number of pages,figures,diagrams,maps,tables,
|graphs, illustrations,...
|
|ISBN:/LCC no.
|
|Type of Research:
|* exploratory         * evaluation   * documentary
|* hypothesis testing  * field
|* case studies        * comparative  * longitudinal
|* descriptive         * analytical   * cross-section
|* ...........
|
|Year(s) when research was conducted:
|
|Abstract/Annotation: (This should cover--> )
|------------------------------------------
|     - objectives
|     - sources of data
|     - study scope
|     - methodology/research techniques
|     - nature of conclusions
|..............................................................................
|
|The other technique that I often use is the "Chain-reaction" method.  Take
|one reference, first go through its references and/or bibliography mentioned
|in the end, search for these books/articles and then read the references
|mentioned in them too, and so on...until I have a large database of books
|and articles on the topic.
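Hari's "chain-reaction" method amounts to a breadth-first traversal of the
citation graph.  Here is a toy sketch of the idea; the work titles and
citation links below are entirely made up for illustration:

```python
from collections import deque

# Hypothetical citation data: each work maps to the works listed in
# its bibliography.  (Keys and links are invented for illustration.)
refs = {
    "Cook & Campbell 1979": ["Campbell & Stanley 1963", "Campbell 1957"],
    "Campbell & Stanley 1963": ["Fisher 1935", "Campbell 1957"],
    "Campbell 1957": ["Fisher 1935"],
    "Fisher 1935": [],
}

def chain_reaction(start, refs, max_items=100):
    """Breadth-first expansion of a bibliography: read one work,
    queue everything it cites, then everything those works cite,
    and so on until nothing new turns up (or the list is long
    enough)."""
    seen, queue = {start}, deque([start])
    while queue and len(seen) < max_items:
        work = queue.popleft()
        for cited in refs.get(work, []):
            if cited not in seen:
                seen.add(cited)
                queue.append(cited)
    return seen

print(sorted(chain_reaction("Cook & Campbell 1979", refs)))
```

The `seen` set keeps the search from re-reading a work that several
bibliographies cite, which is what stops the "chain reaction" from looping.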

Incidentally, Hari is an architect/urban planner with a masters from the Asian
Institute of Technology in Bangkok, and is presently doing his doctoral research
at the Tokyo Institute of Technology.

Meredith Warshaw had a suggestion similar to Hari's, albeit a more ambitious
one:

|When I was a graduate student in psychology at Brandeis, we had a two semester
|required colloquium, one semester of which was similar to what you are
|teaching.  The other semester was more a coverage of all major areas of
|social/developmental psychology.  A major component of this course was reading
|giant review papers such as come out in _Psych Review_.  Is there any similar
|journal for urban studies?  or do any urban studies journals publish broad,
|over-arching review papers?  This type of paper provides exactly what you want
|for helping students "jump start" a literature review for their research
|projects.  A typical Psych Review paper, for example, might well have
|a couple of hundred references.
|
|BTW, one excellent tool that the methods colloquium used was having us act
|as if we were reviewers for journal articles.  Twice during the semester,
|we were assigned to critique papers that our instructors had actually
|received from journals to review, replete with loads of errors.  It was
|a useful way of applying our critical research-consumer skills, which is
|a large part of what you want to impart to your students, since I imagine
|that a large proportion of them will go into practice rather than research
|and we are all aware that unfortunately there's a lot of bad research out
|there.

This sounds like a great idea, but again it requires two semesters.
Moreover, the second semester focuses on one particular subarea of
psychology.  There is indeed a planning analogue to _Psych Review_:
_The Journal of Planning Literature_, although the reviews are not quite
"giant".  But planning is extremely broad and multidisciplinary.
Since my course is a core course, I would have to give equal weight to all
subareas of planning, and this would lead to an overly broad base of concerns
if the course focused on the substantive content of planning research.

By the way, Meredith gave a commercial return address, so I infer that she
does not work at a university.  I think it is interesting that a
practitioner speaks so highly of the value of a good course on research
for professionals.


6 Data Analysis: (SAS) Texts and Data
---------------------------------------

This topic actually included several subproblems: overcoming the inherent
difficulty of using SAS; the fact that most textbooks covering SAS put too
much emphasis on elementary statistical methods, the mathematics of
advanced multivariate statistics, or the SAS language itself;  and the lack
of realistic data with a spatial component.

>The class has a 1 1/2 hour per week computer lab which covers data analysis.
>In general we use SAS because it is available around the university,
>powerful, and widely used.  The text I've used in past years is _SAS
>System for Elementary Statistical Analysis_ by Sandra Schlotzhauer and
>Ramon Littell.  However, both SAS and this text have severe problems.
>
>Although the students have all had one undergraduate statistics course and
>an introductory methods course that introduces them to computers via a
>wordprocessor (WordPerfect) and a spreadsheet (Quattro Pro), they are
>generally not very computer literate.  SAS is a programming language and
>foreign to most students' experience.  Also, SAS is a bit arcane and
>difficult to use.  For example, if a student leaves off a semicolon,
>SAS errors result but the error messages do little good.  I tried using
>SAS ASSIST, the menu-driven front end for SAS, but there is no good
>textbook on SAS ASSIST and statistical methods, the PC version of SAS
>ASSIST is not documented in any text, and the version on our mainframe
>looks nothing like the illustrations in the SAS ASSIST manuals unless you
>run it from a graphics or X terminal.  The Schlotzhauer book spends too
>much time on very elementary techniques and not enough on the more
>advanced techniques our students need (e.g., regression methods).
>
>Beyond this, there is a problem with using data for "realistic" problems.
>Most data suitable for "realistic" problems are unsuitable from a pedagogic
>standpoint.  For example, census data are aggregated, so regression using
>census data should correct for aggregation (e.g., using GLS).  I would like
>the textbook to be oriented towards "real" social science (preferably urban)
>data, either by providing the data or having the students get them, and to
>take students through analyzing the data, step by step.
>
>Finally, most of our students' interests have a spatial component.  It would
>be most desirable to have spatial data and to use SAS' various mapping
>and graphics features.  Of course, this would probably imply having the
>coordinates for geographic features.
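The aggregation point can be made concrete.  The following is a minimal
sketch under invented assumptions (made-up "tract" sizes and a synthetic
individual-level model, not real census data): when regressing group means,
ordinary least squares ignores that the mean of a large group is measured
more precisely, while weighting each group by its size (the simplest GLS
correction here) accounts for it.

```python
import numpy as np

rng = np.random.default_rng(42)

# Made-up "tracts" with very different populations, as census
# groupings have.  Individual-level model: y = 2 + 3x + noise.
sizes = np.array([10, 50, 200, 500, 25, 1000, 75, 300])
centers = np.linspace(-2.0, 2.0, len(sizes))

xbar, ybar = [], []
for n, mu in zip(sizes, centers):
    x = rng.normal(mu, 1.0, n)
    y = 2.0 + 3.0 * x + rng.normal(0.0, 5.0, n)
    xbar.append(x.mean())
    ybar.append(y.mean())
xbar, ybar = np.array(xbar), np.array(ybar)

X = np.column_stack([np.ones_like(xbar), xbar])

# OLS on the group means ignores that a large group's mean is
# measured more precisely: Var(mean error) = sigma^2 / n_g.
b_ols = np.linalg.lstsq(X, ybar, rcond=None)[0]

# WLS, the simplest GLS correction for aggregation: weight each
# group by n_g (the inverse of its error variance) by scaling
# each row by sqrt(n_g).
w = np.sqrt(sizes)
b_wls = np.linalg.lstsq(X * w[:, None], ybar * w, rcond=None)[0]

print("OLS on means (intercept, slope):", b_ols)
print("WLS on means (intercept, slope):", b_wls)
```

Both fits recover the individual-level coefficients roughly, but the
weighted fit uses the aggregated data efficiently; with real census data
the discrepancy between the two can matter much more than in this toy.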

I have organized the comments on this part of the query under several
subtopics:

6.1 Appropriate Statistical Software
------------------------------------

6.1.1 SAS: Pro and Con
----------------------

Andrew Karp, a Program Manager at Pacific Gas and Electric Company, argued
strongly in favor of using SAS:

|I think you should introduce your students to SAS.  It is a 'real-world'
|package, and you will be giving them a definite competitive advantage for
|future employment if they learn how to use SAS.  In addition to my full time
|job at PG&E I teach a number of semester length classes in SAS
|programming and applied statistics using SAS in the Extension Division at
|UC Berkeley.  I get a lot of 'working professionals' or 'soon to be working
|professionals' in my classes who learned how to use Minitab or SPSS-X or
|some other package while in school, but have come to find out that most
|professional/academic/industrial organizations use SAS.  So, they must now
|learn SAS to progress in their jobs, or to find one in the first place.

|The frustrations you mention with SAS as a programming language are, of
|course, an issue for _any_ new user of the product.  My students seem to
|get over these problems relatively quickly.  Again, since SAS is a 'real world'
|tool, I think it's worth a little extra pain at the beginning rather than to
|send someone out into the 'real world' without even basic knowledge of the
|tools to do data analysis.

Jim Fisher also likes using SAS:

|SAS is my pick, and I think that students do well to learn a programming
|language such as SAS, dBase, Paradox or as a last resort spreadsheet macros.
|I am less enthusiastic about Basic, Pascal, etc..  as they are lower level
|and difficult to apply to real life.  SAS 6.08 for windows is much more
|powerful, has better integration of graphics and a good interactive tutorial.
|Check it out.  They may be able to learn enough in a self-taught tutorial.
|Anyway, sounds like you have the makings of a good course.  Slowly, slowly,

Ron Roeder, an undergraduate with a background in computer programming,
testified to the difficulty of using SAS, but he believes this may not be
the fault of SAS only:

|You are right - those SAS messages are not too explanatory.
|But the occurrence of one generally tells where in the step the error was
|made.  I empathize that your students do not seem more computer literate.
|Could this be a function of pure pre-college familiarity with the presence
|of computers and computer-driven devices in their daily lives ?  It has
|been 22 years since I got the Associate's Degree on Computer Programming.
|Guess I tend to think like a machine sometimes.  Wish I had had a strong
|course in high school on the statement of problems and the formulation of
|solutions in some general manner which could be understood by anyone.

Several people agreed that SAS is difficult to use for teaching.
Phil Sidel said:

|As you note, SAS makes the teaching harder.  I don't know any good SAS
|texts, and the manuals are frightful.  If I had to teach it, I would
|write my own short user's guide as a handout.

|What is needed is something like the "SPSS Introductory Statistics Student
|Guide" that Norusis wrote for SPSS.

Neil Henry strongly endorses SAS ASSIST:

|I have been very impressed with SAS ASSIST. I'm using it through terminals
|& Kermit PCs on a DEC Ultrix Mini. Version 6.07 I think. In contrast to your
|experience I found the menu screens do match up with those in the SAS book
|"getting started with sas assist". The book was helpful -- I read it once
|and then ignored it.! But it ought to provide a crutch or safety net
|for students, and is relatively cheap. I haven't yet used it formally
|in a class, but have introduced students to it in a couple of training
|sessions. What I think is necessary with menu systems like ASSIST and
|SPSS-Windows (which I have used in classes this semester) is TIME.
|Time to play without a homework deadline. Otherwise it is bang bang, get that
|R-squared, go home, and they are simply memorizing which button to
|push rather than learning to exploit the system. I don't know how to
|accomplish that yet.
|
|But you have to admit, no semi-colons is a great advance! Of course the
|graduates can't call themselves SAS programmers (which is perhaps also an
|advance).

Thanks for the idea, Neil; as you'll see, I'm using it!

6.1.2 Alternatives to SAS
-------------------------

6.1.2.1 Alternatives Suggested
------------------------------

        Minitab
        PC Statistician
        S+
        SimCity
        SPSS-PC
        Stata
        SYSTAT
        UDMS


6.1.2.2 Comments on Alternatives
--------------------------------

Hari Srinivas suggested three of the alternatives:

|Tried SPSS-PC? There is now a Windows 3.x version out. May be worth exploring.
|The Windows 1 version is more helpful and easier to use than earlier versions.
|But both DOS and Windows versions are available. Text: the standard SPSS-PC
|manual
|
|UDMS: Urban Data Management Software. My UDMS disk is in 5.25" disks and I
|don't have a 5.25" drive yet, so more info on that later, if you need it.
|
|For a break from back-breaking work, play SimCity -  a game in simulated
|urban planning - good fun, but also gives valuable insights into the planning
|process/data needed etc.!!

Lack of funds and support keeps me from considering SPSS-PC, and I've been
too afraid of becoming "hooked" to play SimCity.  I'm going to have to find
out more about UDMS.

Kelli Klebe suggested SYSTAT.  He says:

|The book that comes with the software is very readable and useful.
|It comes with some data sets.  The student version does everything that
|the original version does except factor analysis.  It is only $30 for the
|students to buy and for every X number of copies your bookstore sells you
|can get one free to put on computer labs.  It comes in DOS and MAC versions.
|It is put out by Course Technology, Inc. in Cambridge MA.

T. Robert Harris also mentioned SYSTAT and compared it to SAS:

|if something menu-driven on pcs would be better, such as SYSTAT.
|I don't have personal familiarity with it.  Also, your students would
|want to be able to handle large datasets, I presume (eventually, if not in
|your class).  Or software that includes procedures for spatial analysis.

Jacques Rioux, of the Universite Laval, had this to say about SYSTAT:

|I know SAS is powerful but I think it is too much. Maybe you should look
|into something more intuitive like SYSTAT. It has a nice graphical
|interface, is menu driven and allows you to do almost any data analysis you
|want.

I've never used SYSTAT, but if all it has to offer are menus, SAS ASSIST
may be just as good.  Since at least some of my students (e.g.,
those who go on to work for regional agencies or those in larger cities)
may have to analyze large datasets and/or do spatial analysis, SAS seems
more appropriate.

Jack Needleman suggested using Stata because "the front end time costs are
much lower, there are good graphics, and a wide range of canned procedures."
Ron Roeder suggested Minitab and PC Statistician, but he gives no
description of them, except that PC Statistician is from IBM.

Alan Zaslavsky says:

|The computer is a problem ... I use Minitab, which wouldn't be adequate
|to your needs, and even that has been a pain.  Although I have moved to
|the menu-based PC/Mac versions, some of the students still have trouble
|(even with very detailed directions), in addition to the problems with
|mistakes by the computer center or difficulties in installation on their
|own personal machines that contribute to making things more difficult.
|I suspect that you won't be able to do much better than SAS, and you
|just have to provide a lot of examples for them to work from so they don't
|have to read the whole manual just to get started.

Catherine Dalzell, from the Canadian Red Cross Society, favors S+:

|Let me put in a word for using S (or S+) instead of SAS.
|I once taught a course to 4th year undergrads, who had taken quite
|a few stats courses using S.  We had a classroom equipped with terminals
|connected to a MIPS computer, where we had loaded the AT&T S.  Given the
|equipment and the computer gurus, it was only about $1000 to get the software.
|It worked well.  With S, the students have to know what they are doing
|to get the computer to give them anything.  SAS and other packages tend to
|do a great "results" dump, giving statistics and p-values for all sorts
|of tests that the students may not understand.  This encourages them
|in the notion that it is o.k. to use the results of techniques that we
|do not understand.  Bad lesson!  I gave mini-lectures on S, as needed,
|and mini-lectures on statistical techniques.  Basically, they had seen
|everything once before, and this course was to give them facility in using
|what they had covered.  I imagine that your students will be in the same
|situation.  A lot of class time was spent working with data sets, under
|supervision.

Although he suggested SYSTAT, Jacques Rioux actually prefers S+:

|Somewhere between SAS and SYSTAT (in ease of use) is S-Plus. This is what I
|use most often. It is very powerful and more flexible than both SYSTAT
|and SAS since you have access to the source code of every routine
|implemented. It is extremely complete.

I have used S+, and I agree it gives a much better "inside" look at what's
going on without getting caught up in FORTRAN-like programming.  Still, I
think it's too much for students who have no prior exposure to computer
programming or the mathematics of statistics.  I hope the approach I am
adopting deals successfully with Catherine's "bad lesson".


6.2 Suggested Texts
-------------------

The texts suggested for the data analysis component include:

Babbie, Earl R.  _The Practice of Social Research_ 6th ed. Wadsworth.

Bainbridge.  _Social Research Methods and Statistics: A Computer Assisted
        Introduction_

_______. _Survey Research: A Computer-Assisted Introduction_.

Box, Hunter, and Hunter. _Statistics for Experimenters_.??

Cody and Smith. 1991. _Applied Statistics and the SAS Programming Language_,
        3rd ed.  Prentice Hall.

Di Iorio, Frank. _SAS Applications Programming: A Gentle Introduction_.

Jaffe.  _Mastering the SAS System_.  NY: Van Nostrand Reinhold.

Myers, Dowell. _Analysis with Local Census Data: Portraits of Change_.

SAS Institute. _Getting Started with SAS/ASSIST_.

_________.  _SAS Introductory Guide_

_________.  _SAS User's Guide_

Spector _SAS Programming for Researchers and Social Scientists_.  Sage.

Tufte's books on graphical display of data.


6.2.1 Comments on Suggested Texts
---------------------------------

These comments are split into two categories of texts, pertaining to
(1) data analysis and (2) related computer software.

6.2.1.1 Data Analysis Texts
------------------------------

Ed Waltz believes that "masters students need as much information on how
to PRESENT statistical findings as they do on how to ANALYZE the data" and
suggested using excerpts from Tufte's books.

John Blodgett, of the University of Missouri's Urban Information Center,
says:

|This is a little bit out of my field, as I am neither an academic nor a
|social scientist.  However, I do work with and do analysis of social
|science data, almost entirely census data.  I thought that "Analysis
|with Local Census Data - Portraits of Change" by Dowell Meyers was an
|excellent book.  It may not be suitable as the primary text in a general
|course since it is specifically focused on the use of decennial census
|data, but it would make an excellent supplementary text.  And an excellent
|practical guide to hands-on applications of the data.

I wholeheartedly agree with John's assessment, and I plan to use Myers'
book for my introductory methods course next fall.  As John
suggests, it's too off-topic for a more general course on research methods,
but I am reading through the book now and may design some assignments around
topics covered in this book.

Ron Roeder mentioned that the Babbie book can be ordered with two diskettes
pre-programmed to show the results of different stat processes on the PC.
These include topics such as correlations, crosstabs, etc.  Babbie also
has several chapters on elementary data analysis.

The books by Bainbridge seem similar to Babbie.  But Phil Sidel says neither:

|is geared to standard statistical packages.  He [Bainbridge] avoids them,
|by providing his own limited, but easy-to-use set of programs on a diskette
|that comes with the text.  He also provides ready to use data and a
|crude data-entry program so you can enter a small class survey.  I devoted
|just half a class period to "computing" -- the students who didn't get off
|the ground at that point (about a quarter of the class) came in during
|office hours for a 1-on-1 tutorial that got their diskettes backed up
|and them running the programs.  The tutorials took about 15 minutes each
|-- sometimes there were two or three students at a time.
|
|One further comment - criticism - on the Bainbridge texts.  There is a
|great deal of time/words spent on detailed explanations of which key to
|press when it is fairly obvious from his menus.  I would prefer a slight
|expansion of his appendix on using the programs and less details about it
|in his text -- but students don't read appendices and manuals (I'm not
|sure they read texts either), so his approach may be the best way to go.


6.2.1.2 Computer Software Texts
-------------------------------

Neil W. Henry, from VCU, says:

|I have also found the texts that describe SAS or
|SPSS and various statistical methods to have too much elementary
|statistics. At this point in time I think that SPSS materials
|are better than those from SAS for teaching purposes. The most
|frustrating thing is having to assign several texts just because
|no single one matches what we want to cover, especially when they all
|cost 30-40 bucks. But the mapping capability of SAS GRAPH is more than
|(my versions of) SPSS can match.


Andrew Karp shared my view of the Schlotzhauer/Littell book and
suggested three others:

|There are a limited number of resources for teaching people SAS.  The
|Schlotzhauer/Littell text has a lot of problems with it, as you've pointed out.
|You may want to consider "Applied Statistics and the SAS Programming
|Language" by Cody and Smith (3rd ed) and/or "SAS Applications
|Programming: A Gentle Introduction" by Frank Di Iorio.  Another book, by
|Spector, is called something like "SAS Programming for Researchers and
|Social Scientists", published by SAGE.  I reviewed it, but did not adopt it for
|my classes.

Don Richter also favors Cody & Smith.

I reviewed these three books and decided against all of them because they
emphasize the SAS language, rather than using SAS.

Andrew also suggested developing handouts, which I have done in the
past and will do this coming semester:

|The other thing you may want to do, if you have time, is to
|come up with a set of handouts to distribute to your students.  Also, are there
|any on-campus "how to use SAS" courses taught by your computer center
|staff?  Maybe they can help you out.

Ron Roeder says Babbie has a 30-page appendix on basic SAS programming,
but my inspection copy only has an appendix on SPSSX.  Ron also mentioned
the Jaffe text.  He says it is "longer on prose explanations, and more liberal
in illustrations than the _SAS Introductory Guide_".  However, Ron never felt
the need to use Jaffe because he finds the documentation from SAS Institute
and SAS error messages sufficient.



6.3 Data
--------

Andrew Karp also had something useful to say about data:

|I share your concern about 'realistic' data.  I am always looking for good data
|sets, and there are few to be found.  I don't work in your field, so I am not
|personally aware of where to potentially find 'realistic' data, but I sometimes
|generate data sets using SAS random number generators and PROC
|STANDARD.  Then I get exactly what I want in terms of a data set.
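
Andrew's trick can be sketched in a few lines of SAS.  This is my own
hedged reconstruction of the idea, not his code; the seeds, sample size,
and the MEAN=/STD= targets are arbitrary:

```sas
/* fabricate a data set, then use PROC STANDARD to force   */
/* each variable to a chosen mean and standard deviation   */
data fake;
    do i = 1 to 200;
        x = rannor(12345);              /* standard normal draw */
        y = 0.6*x + 0.8*rannor(67890);  /* built-in association */
        output;
    end;
    drop i;
run;

proc standard data=fake out=scaled mean=50 std=10;
    var x y;
run;
```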

Ron Roeder suggested using the data distributed with Babbie's book on
social research methods.  Apparently there are several datasets, each with
almost 100 variables.

Alan Zaslavsky thinks differently, and ties his argument to the course
goals:

|Census microdata files and other public use files seem to be the way
|to go.  You can do some preprocessing to get them down to manageable
|proportions and then set them loose on them.  Again, it partly depends on how
|important it is to you that their project for the course be exactly the same
|project that becomes a thesis eventually.  With a two-year program, you may
|not have much leeway on the issue.

Steve Doig, Associate Editor of _The Miami Herald_, also came out on the
side of using census data, correcting me on some of my assumptions:

|You are correct that most of the 1990 census data released earlier was
|aggregated. However, the Public Use Microdata Sample (PUMS) datasets are now
|out. These are 5% samples of actual returns, which means you can grow your own
|crosstabs. They even include weighting variables, which work easily with SAS.
|Best of all, the PUMS datasets now are available on CD-ROM (as are the various
|STF releases), so this kind of analysis can be done on a PC.
|
|You also mention the spatial component of data. That's another real advantage
|of census datasets -- they include the centerpoint lat/long of every block
|group, tract, etc. I constantly use census data with Atlas*GIS (a PC mapping
|package) to make thematic maps we couldn't have dreamed of printing even a
|couple of years ago.
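
As Steve notes, the PUMS weights drop right into SAS, since most
procedures accept a WEIGHT statement.  A sketch of a weighted crosstab
(the data set and weight variable names are placeholders, not the actual
names on the PUMS files):

```sas
proc freq data=pums;
    tables tenure*race;   /* "grow your own" crosstab        */
    weight perswt;        /* hypothetical person-weight name */
run;
```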

Ed Waltz suggested using data from health planning:

|I'd encourage you to look at some data sources on health statistics.
|They are a treasure trove of information, and they present an array of
|'problems' (e.g., merging Census and health data files to calculate
|population-based rates) for students to work through. They also lend themselves
|well to geographical analyses. (i.e. I can get tract-, ZIP-, and
|MCD-level data) If you're interested in that direction, I'd start with 'Health
|of the United States' (U.S. GPO-- you probably have it on government deposit
|in your library), which ** also comes on computer disk in summary table
|form**.

While some health planning might be OK for my class, I could not make it a
major focus.  We have no health planning option in our curriculum; what
people call "planning" in Rhode Island tends to be mainly physical,
land-use planning.  Some years ago a student who had been working in a
local public agency for a few years and had an "I've been in the real
world, but you've just been in academia" attitude (do you know the type?)
challenged my use of health examples in my introductory methods course.  I
had to explain to him that Rhode Island's omission of health from standard
planning concerns is by no means universal.  The student was convinced when
I told him the Los Angeles Comprehensive Plan has a separate section on
health.  Until then, the student thought I was simply imposing my interests on
him (which, he assumed, is what all academics do from their ivory towers).


6.4 Should We Teach Data Analysis This Way?
-------------------------------------------

6.4.1 Is Statistics Dangerous?
------------------------------

One issue that came up several times and in different contexts was students'
lack of preparation in mathematics.  This took a variety of forms, ranging
from pointing out that students lack even basic mathematics to arguing that
proper use of applied statistics requires considerable advanced training.
This problem seems common among planning students and other students in the
behavioral and social sciences, even those from graduate programs at elite
universities.

I would hypothesize this is particularly true in Masters programs because
students in these programs have weaker academic backgrounds than those in
PhD programs.  This comment from Neil Henry illustrates this point:

|the trouble with the MS programs I know is that most students had horrible
|undergraduate experiences in math and stat and so have little I can build on.

Deb Niemeier is even stronger in her criticism.  She writes:

|I am a PhD candidate in transportation engineering with much
|exposure to urban planning students, most suffer I would say
|from lack of quantitative skills - sometimes even the most basic.
|I am also a masters student in statistics and suffered frustration
|over the lack of a 'basic' stat course for those less quantitatively inclined.

I'm not sure I understand Deb correctly.  My guess is she's expressing
frustration that the "less quantitatively inclined" do not have
their own course and instead make the "standard" course more watered down.
It always strikes me as strange that graduate programs in planning and
other "applied" social sciences require their students to have undergraduate
degrees but then count on so little from the students' undergraduate
experiences.  It is one thing to have a remedial course in economics for
students who have not had econ as undergraduates or a course in remedial math
for students who have not had college-level math.  But when the remedial econ
class cannot use calculus because many students have not had either
economics or calculus, I wonder why we bother making them go to undergraduate
school at all.

Certainly there is strong opinion that, while they by no means should be the
_main_ focus of social science, quantitative methods are basic to social
science.  For example, Mark Hill writes:

|I think social science research requires a strong knowledge of quantitative
|methods.  The best investigators have skills that rival their peers in
|statistics and math departments (e.g., Sam Preston at Penn).

|Also, I agree that math should be presented to children as a core necessity,
|although I know that in reality most functioning adults rarely use it (beyond
|calculator arithmetic).  Perhaps the ticket is for schools to focus on
|the utility of algebra etc. in real life problem solving in the lower
|grades--before kids decide that it is not worth the effort needed to learn
|basic quantitative skills.

|Personally, I always hated math until I became interested in human
|population applications.  Now my attitude has completely changed.

My only comment is that it is misguided to assume education should be
"useful" in the sense that what is taught must be directly transferable
to everyday life.  The purpose of education should be to teach people to
use their brains.  If that's not useful enough, then I don't know what is.
I don't think the underlying vocational assumptions about the purpose of
education are as strong anywhere in the world as they are here in the US.

However, this problem apparently is not confined to the "applied" social
sciences.  A statistics professor at Princeton (name withheld to protect
the innocent) forwarded the following note from one of his graduate students:

|A couple of weeks ago I sat in on Sociology 540 (or is it 504) which
|is their grad level stats course.  It was one of the oddest
|experiences of my life.  It was unlike any stats course I have ever seen
|before; my advice to people in the humanities (sic) is that they should just
|hire a statistician when they need some stats done.

So, even at elite schools (can I call Princeton "elite"?), there seems to be
a strong perception that statistics as usually taught to social scientists is
more dangerous than helpful.

Still, the problem at Princeton is more complex than this.  The same
professor later wrote:

|I completely agree with you about the state of mathematical skills.
|A colleague of mine ... has taught courses for us here at Princeton.  For
|years he taught statistics to advanced graduate students in our
|Psychology Dept.  One year we let him teach a data analysis course
|to our seniors in what we call Engineering Management Systems --
|kids who have had 2-3 statistics courses, OR courses, etc etc.
|He thought he had died and gone to heaven.  Part of the problem here
|at Princeton is the Math dept itself which is an incredible research
|dept (including Fermat's last theorem this summer as you probably
|know), but has very little interest or skill in teaching any but the
|few it thinks will be future mathematicians.  This is a dept of 50
|faculty with about 10-12 majors a year!!!

Lest any administrators out there think this is an argument to substitute
a "teaching" for a "research" orientation, let me quickly recount my
experience in the late 1970s at San Francisco State University, an
institution that calls itself a "teaching university."  The Board of
Governors had mandated that all graduates of the California State Universities
master "quantitative reasoning" at the level of "a truly educated college
graduate."  I was on a university-wide committee charged with recommending
the quantitative skills that "a truly educated college graduate" should
master.  We recommended essentially a precalculus course, but one requiring
proofs and similar "hard" and abstract reasoning skills.  When we proposed our
recommendations to the GE committee, who do you think shot us down?  You're
right, the Math Department.  The math faculty knew that if our
recommendation went through, they would have to teach the course.  They
didn't want to have to teach all those mathematically illiterate
undergraduates, no matter how many new faculty lines were tied to the new
requirements.  The Math Department was more than happy to take the cream
of the mathematical crop and send the unwashed masses to the schools of
nursing, social science, and business where passing a "quantitative reasoning"
course meant you could read a line graph and do percentages.  So, I don't
think the problem is one of teaching versus research as much as it is a
problem of teaching students who come out of high school with woefully
inadequate math skills.

By far the strongest argument against even attempting to teach data
analysis in the format proposed came from Herman Rubin, in Statistics at
Purdue.  Herman and I had several exchanges over STAT-L, and I won't bore
anyone with a complete recount here.  The following excerpt should give you
the flavor of Herman's concerns:

|I am quite concerned about what you propose to do, as the crude
|applications of statistics in complicated situations can have quite
|disastrous consequences.  I am quite aware of the problems with
|applying statistical methods in inappropriate situations, and also
|the problem of deciding how to attack problems; I have consulted,
|formally and informally, with people in various fields, and am
|aware of the misuse of statistics in even such "good" fields as
|medicine.  If it is only which articles are published in journals,
|that is one thing, but when strong national policy is going to be
|made, that is quite another.

|The problem is not one of learning methods, but of learning to
|formulate problems for which an adept may even have to produce
|ad hoc methods even to get a solution which can be partially
|relied upon to be a valid use of statistics.  I started out working
|for economists, who had had it clearly pointed out to them that the
|standard regression methods were WRONG.  It was only after several
|steps that something LIKE a regression could be found which was
|reasonably good.  I say like a regression, because some of the
|variables used are not those in the equations.  But it took several
|years of development to get this.

I largely agree with Herman's argument, but unfortunately I don't really
have the choice between teaching statistics "the right way" and being sure
my students won't abuse the little knowledge about statistics gained in
my course.  The issue many people in my situation seem to be grappling with
is how to teach statistics so students (1) won't be snowed when they see a
statistical analysis, (2) are intelligent "consumers" of statistical
analyses conducted by others, and (3) have sufficient skill to do data
analysis in their theses without making ridiculous mistakes.

I do, however, think Herman's call for "truth-in-labelling" is right on the
money:

|If we labeled our methods courses as "How to compute statistical results if
|they are the appropriate things to compute, but you will not learn in this
|course if the assumptions are satisfied", they will not do too much harm.  We
|should try to teach the concepts, and we do not even do this for our majors.

However, the ontological assumptions underlying most applications
of statistical methods seem an even worse problem than assuming the
mathematical assumptions are satisfied.  If the world consists of
processes and relationships, there is great danger in a mathematical method
that encourages one to think in terms of discrete, atomistic "variables"
having their own causal powers.

Byron Davis took my side in this debate.  Thanks, Byron:

|I find that Marsh's last comment hits home for me. The best that a professor
|can hope to accomplish in an introductory research methods class is to
|endow their students with a healthy skepticism of all others' research
|conclusions.
|
|We on the list have extensive examples of quite learned colleagues who
|don't always agree with each other. To hope to endow students in a
|single class or even a series of classes with the statistical and
|methodological sophistication to be able to grapple with problems at
|the level Herman suggests seems unlikely to me. Not to say that it is
|not a worthwhile goal, just that the time to accomplish such ends
|is not present in the circumstances which Marsh describes.

My exchange with Herman brought up two other points.  First, it ought not
be such a big deal for social science students to learn to use statistics
correctly.  If all social science undergraduates took at least one quantitative
course per semester, starting where high school usually leaves off
(calculus) and going on from there, they would have ample background in the
underlying mathematics and perhaps practical aspects of applications to do
what Herman is advocating, especially if this undergraduate work were
followed by a year of methods courses in graduate school.  Unfortunately,
most social science students today seem to approach mathematics as if it
were something like being left-handed: if you're not born with an aptitude
for math, don't even try learning it.  The sad part is that our educational
system not only lets them get away with that attitude, but also encourages
it.  If knowing math at the level that engineering students know math
helps make someone a good social scientist, then we should demand it of our
students IN ADDITION to all the other stuff they need to know.

The second point follows on this.  During the 20th century statistics
unfortunately evolved hand-in-hand with "systematic empiricism,"  an
approach Willer and Willer label "pseudo-science."  Putting a statistician
who sees "variables" where there are historical social processes into a
policy-making capacity gives us no reason to expect better policies.
Good social policy takes at least three ingredients: good theory, good
analysis, and the political power to do good.  I'd argue that because the
social system we live under constrains policy to preclude the latter,
it matters little that social science graduates often lack the skills to
provide one or both of the first two ingredients.  Applied social science
graduates are not like brain surgeons who, if they make a mistake,
will fail to save lives.  They're more like used car dealers who put a nice
gloss on a social system that creates the problems the students ostensibly
are hired to solve.  Since the students aren't hired to throw out the
social system, there's little they can accomplish no matter what they
know.  Uh Oh, looks like I'm getting political, which seems off-topic here.
Just read Piven and Cloward's _Regulating the Poor_ to get my point.


6.4.2 Topics to Cover
---------------------

Neil Henry favors separating statistics and computerized data analysis
into two separate courses.  Of course, since mine is the only course in
the curriculum focusing on social scientific knowledge, my course must
combine not only these two subjects but research methods and epistemology
to boot!  Still, pedagogically Neil makes a good case:

|I've been ambivalent for years about combining computer
|instruction and statistics education into a single course. Principles
|of inferential statistics (confidence intervals, p-values, probability
|statements) don't require a statistical package built for data
|analysis. Examples which work well to illustrate the underlying theory
|do not provide the realism necessary to prepare a student for thesis
|research. The danger is that we reduce statistics to where to look on
|the output for the right number. So ideally I would have two courses,
|one on logic and methods, and a second on data analysis using real
|data.
|
|At the undergraduate introductory level I would like to introduce
|more of a sense of play: computer as random number generator and
|grapher rather than number cruncher. (This conflicts with the
|current ideology of education as job training) There is not much
|need for t-tests or anovas until somebody has some exposure to
|research design and the nitty gritty of data creation.

Stephen Laffey had the following to say about what data analysis tools
planners need.  I've interspersed some of my own comments with Steve's.

|Planners have to be able to optimize and analyze so I would be tempted
|to focus on regression, MANOVA and linear programming.  The deductive
|and inductive approaches to science, hypothesis testing, simple
|correlation.  While SAS is not friendly, it would be fairly simple to set up
|a logical sequence of examples to demonstrate how deductive science is based
|on the Hypothesis test and tests of association and models to explain
|interaction in the urban environment.

I'd also like planning students to learn linear programming, not so much
for its use in actual optimization as for its lessons.  Duality in
linear programming teaches students that even the most structured problem
can be looked at in two different ways, and that goals are sometimes best
achieved by attacking the constraints rather than accepting them as given
and mindlessly pursuing the stated objectives.  The notion of shadow prices
is to my mind one of the great intellectual achievements of the 20th
century (alright, so Morishima credits Marx with being hot on the trail 100
years before von Neumann).  I do think shadow prices have some implications
for my postings and Jim Devine's comments on the "social factory" on
PEN-L.
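
Shadow prices are easy to demonstrate numerically: solve a small LP, add
one unit of a resource, re-solve, and the change in the objective is that
resource's shadow price.  Here is a minimal sketch (Python rather than
SAS, and the problem's numbers are invented for illustration) that
enumerates the vertices of a two-variable LP instead of calling a solver:

```python
from itertools import combinations

def solve_2var_lp(c, A, b):
    """Maximize c.x subject to A x <= b, x >= 0, for a 2-variable LP,
    by enumerating the vertices of the feasible polygon."""
    # Treat x >= 0 as two extra constraints: -x <= 0 and -y <= 0.
    rows = [(a[0], a[1], bi) for a, bi in zip(A, b)]
    rows += [(-1.0, 0.0, 0.0), (0.0, -1.0, 0.0)]
    best = None
    for (a1, b1_, r1), (a2, b2_, r2) in combinations(rows, 2):
        det = a1 * b2_ - a2 * b1_
        if abs(det) < 1e-12:
            continue  # parallel constraint lines: no vertex here
        x = (r1 * b2_ - r2 * b1_) / det
        y = (a1 * r2 - a2 * r1) / det
        # Keep the intersection only if it satisfies every constraint.
        if all(a * x + b_ * y <= r + 1e-9 for a, b_, r in rows):
            val = c[0] * x + c[1] * y
            if best is None or val > best:
                best = val
    return best

c = [3.0, 2.0]                 # objective: maximize 3x + 2y
A = [[1.0, 1.0], [1.0, 3.0]]   # two resource constraints
b = [4.0, 6.0]

base = solve_2var_lp(c, A, b)
# Shadow price of a constraint: the gain in the objective from one
# extra unit of that resource.
shadow = [solve_2var_lp(c, A,
                        [bi + (1.0 if i == j else 0.0)
                         for j, bi in enumerate(b)]) - base
          for i in range(len(b))]
print(base)    # -> 12.0
print(shadow)  # -> [3.0, 0.0]
```

In this toy problem only the first constraint binds at the optimum, so it
alone carries a positive shadow price; relaxing the slack constraint buys
nothing, which is exactly the lesson about attacking constraints.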

Nonetheless, there's an awkward fit between research methods, data
analysis, and linear programming.  The first two deal with the process of
discovery; the last, problem solving.  Given the constraints I work under,
linear programming just doesn't seem to fit in anywhere.

|I am into transportation and spend a lot of time with TIGER & STF-3 data.  If
|you have ARC, or another GIS, you might be able to add the visual element to
|statistics by using ARC to plot the results of your SAS outputs.  Maybe using
|census tracts as your common spatial denominator throughout the class will
|assist in the demonstration of association, difference and relationship ideas
|across statistics and GIS.

SAS also has basic mapping graphics.  So, although we do have access to
ARC/INFO, I'll probably stay with SAS.


7 Topics and Issues in Social Science Research Methods Course
---------------------------------------------------------------

This question was more general:

>What topics do you feel should be covered in a social science (planning)
>research methods course?  Why?  What is the best way to teach these?

Alan Zaslavsky writes:

|I teach a large undergraduate introductory stat course (about 2/3 economics
|concentrators, the rest from a miscellany of government, areas studies,
|biological sciences, etc.).  I try to include more of a section on research
|methodology (particularly about the design of experiments, surveys, and
|observational studies ...  partly as a way of getting at the issues of
|causality and representativeness that these study designs address) than can
|be found in any introductory stat book I have seen.

|I used to teach an intro stat course for Anthropology grad students,
|and only during my second year did I find out that this was not only
|their only required statistics but actually the only concrete research
|methods course they got ... all the rest was highfalutin' theory.

|I ask the students in my undergrad course to do a short project.  Some
|of them do fairly nice small surveys or analyses of economic data sets,
|mostly based on public sources.  I am happy if they at least show an
|awareness of some of the issues in design and interpretation of these
|studies, even though the scope is small.  Perhaps (if you are able to
|do this within the structure of your program) it would make sense for
|you to adopt a similar standpoint, i.e. to put less emphasis on
|actually finding a thesis topic and more on carrying through a more
|limited piece of work with good attention to methodology.  Of course,
|even the most novice grad student has a better knowledge base than
|almost all undergraduates.

Hari Srinivas gave the following course outline:

|============<:A:>==========================
|* Elements of Social research:
|     - the research process
|     - causation
|     - theory in social research
|     - research design
|
|============<:B:>==========================
|* Research types for urban planners:
|     - exploratory research > comparative study
|     - evaluation study > control group study
|     - testing > case study
|* Objectives and purposes of research
|* Multi-dimensional/multi-level approaches
|* Research methods: description, diagnosis, analysis
|
|============<:C:>==========================
|* Primary and secondary research categories
|* Sources of information
|* document (book, report, article) interpretation
|* Techniques of commenting on available data
|* Flow chart/critical path of research
|* Definition, hypothesis, theory, model
|
|============<:D:>==========================
|* Design of a research project
|     - selection of parameters
|     - identification of related variables
|     - identification of target groups
|     - co-ordination of complex/simple variables
|     - differentiation between qualitative/
|       quantitative variables
|
|============<:E:>==========================
|* Survey questionnaire construction:
|     - wording v/s charting questions
|       (eg.socio-economic & demographics data)
|     - formulation of questions, design of charts,
|       directions, incentives, sequence of topics
|     - organization of survey conduct
|     - selection of operational/nominal definitions
|
|============<:F:>==========================
|* Other forms of data collection:
|     - interviews
|     - observations
|     - experiments/tests
|     - discussions
|     - postal/telephone surveys
|     - documentation and content analysis
|     - rapid appraisal
|     - structured conversation
|     - action research
|     - photographic documentation
|     etc......
|
|============<:G:>==========================
|* Sample size estimation
|* Sampling procedure
|* Scaling techniques
|* Statistical tests
|* Multidimensional tables
|
|============<:H:>==========================
|* Interview guidelines
|* Editing
|* Data processing
|* Data Evaluation
|* Data interpretation
|
|HOW?....................................
|
|Best would be in groups, each covering one aspect/issue of the research
|and then everybody sharing experiences at a "master" presentation.
|The campus itself can be used as the research area (saves time and effort -
|easier to conduct too):
|
|* Example -> "Bicycles on campus".
|
|     - what is the problem being studied?
|     - what are the issues involved?
|       (what,when,where,which,why,who,how)
|     - what are the research objectives?
|     - what hypotheses can be proposed?
|     - what are the parameters/variables to be
|       covered?
|     - what data will be needed?
|     - what is the best way to gather it?
|     - How can the data be collated and analyzed?
|     - How can the data be interpreted?
|     - What will the major conclusions be?
|     etc.............
|
|Besides, your idea of "sharing" readings mentioned in Supplementary reading
|also sounds good and can easily be incorporated here too. A lot more gets
|done that way!

Lots of good stuff here, Hari.  I'll unashamedly steal your ideas.


8 My Plan for the Course
--------------------------

And now the envelope.  Here's how I decided to structure the course.
First, I'm dividing it up into three components: the class
lecture/discussion sessions, the lab, and individual student work.  Second,
grades are based on a scoring system of up to 2,000 points.  All students
are required to do tasks whose total value is 1,000 points, and everything
beyond that is extra credit.  If a student scores 850 points, that's a B,
so people who want A's will have to do some extra credit.

8.1 The Data Analysis Lab
-------------------------
The lab is the most straightforward.  I am using two textbooks:

        Judd, Charles M. and Gary H. McClelland. 1989. _Data Analysis: A
                model-comparison approach_.  San Diego: Harcourt Brace
                Jovanovich.
        
        SAS Institute, Inc. 1992. _Getting Started with the SAS System
                Using SAS/ASSIST Software, Version 6_.  Cary: SAS
                Institute.

I really like the Judd book.  It's introductory in the sense that it
presumes no previous statistics knowledge, but it's pretty sophisticated
and doesn't waste a lot of time on trivia (like histograms vs. ogives).  It
starts out by explaining that all data analysis can be reduced to:

        DATA = MODEL + ERROR

and the goal of data analysis is to develop parsimonious MODELS with
minimal ERROR.  From there it goes into ways of measuring ERROR and relates
them to so-called descriptive statistics.  Then it's into sampling
distributions to determine if ERROR results from chance, and from there
into ANOVA and regression.  After the first 147 pages, it goes into
multivariate regression, outliers, and other advanced topics.  The book
also uses no math beyond algebra and provides relevant examples of output
from statistical packages, with SAS being one of the examples in most
instances.

What I intend to do is to develop 16 separate assignments for the lab.
Assignments 2-16 correspond to Chapters 2-16 in Judd.  Each assignment is
worth at least 50 points, although extra-credit questions will allow higher
grades on assignments.  The extra-credit questions I plan to ask are tough,
and I won't grade leniently, so extra credit will truly be earned.  For
example, Judd
gives a proof that the mean is the best unbiased estimator when the
sum-of-squares is used as the criterion for "best."  For extra credit, I
plan to ask them to prove that the median is best when the sum of absolute
deviations is the criterion.  Students are required to finish the first 7
assignments and they get extra credit for every additional assignment.
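
The claim behind that extra-credit problem is easy to check numerically.
The sketch below (Python rather than SAS, with made-up data) scans
candidate one-parameter models on a grid and confirms that the sum of
squared errors bottoms out at the mean while the sum of absolute errors
bottoms out at the median:

```python
# Check the two claims numerically: the mean minimizes the sum of
# squared errors; the median minimizes the sum of absolute errors.
data = [2.0, 3.0, 5.0, 9.0, 21.0]  # illustrative data, deliberately skewed

def sse(m):  # sum of squared deviations from a candidate model m
    return sum((x - m) ** 2 for x in data)

def sad(m):  # sum of absolute deviations from a candidate model m
    return sum(abs(x - m) for x in data)

# Scan candidate single-parameter models on a fine grid from 0 to 25.
grid = [i / 100 for i in range(0, 2501)]
best_sse = min(grid, key=sse)
best_sad = min(grid, key=sad)

mean = sum(data) / len(data)
median = sorted(data)[len(data) // 2]
print(best_sse, mean)    # -> 8.0 8.0  (SSE minimized at the mean)
print(best_sad, median)  # -> 5.0 5.0  (SAD minimized at the median)
```

The grid scan is deliberately brute force: the point for students is that
"best" depends entirely on which measure of ERROR you adopt.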

There will be no deadlines, but students will be limited to turning in at
most one assignment in any given calendar week (I have to make allowances
for the 16th assignment).  That way slower students can take all semester
on Chapters 1-7 in Judd (approximately 147 dense pages), or knock off all
16 chapters over a weekend.

The first assignment will introduce them to SAS ASSIST by having them log
on, start SAS/ASSIST, go through the tutorial, and then go through the
_Getting Started_ book.  Also as part of this assignment, they will
initialize a SAS data set using data used in Judd.  Then each subsequent
assignment will consist of (1) replicating any computer output reproduced
in Judd, (2) answering the questions in the corresponding chapter in Judd,
and (3) doing some more contrived computer exercises that I give them.

Since this will be the second time these students have seen introductory
statistics, I will expect them to do most of the work themselves.  I will
be available in lab, and they can use the SAS documentation and SAS ASSIST
help.  They will work in competency-based study groups to master the
material in Judd, but the assignments will be done individually.

8.2 Class Sessions
------------------

The class sessions will be organized around two textbooks:

        Andranovich, Gregory and Gerry Riposa. 1993. _Doing urban
                research_.  Newbury Park: Sage.

        Babbie, Earl. 1992. _The practice of social research_, 6th ed.
                Belmont CA: Wadsworth.

        
The lecture/discussion part of the course will be organized around the
subjects covered in Babbie.  These include (my terminology):

        - Epistemology
        - Research Design and Planning
        - Data Collection
        - Data Analysis
        - Use and Misuse of Research

Since the lab covers data analysis, we will not cover Babbie's discussion
of this subject.  I plan to begin with Babbie's first three chapters on
social research, the scientific method, and causality.  I plan to have some
supplements for "jigsawing" some issues raised in his overview.  At the
start of the semester I will also give out a list of books with positions
critical of the positivist account of science.  The students will each
select one book and prepare a book review due by mid-semester.  I will
encourage them not to start reading the book until we finish Babbie's
chapter on causality during roughly the third week.

The books will be grouped into three categories: easy, intermediate, and
difficult (black diamond).  Students may select a book from any one category,
corresponding to a maximum of 100, 150, or 200 points towards their final
grade.  The category for any individual book is determined by both its length
and inherent difficulty.  Examples of books in each category are:

        Easy (100 points):
                Barry Hindess.  _The Use of Official Statistics in Sociology_.
                C. Wright Mills. _The Sociological Imagination_.

        Intermediate (150 points):
                Keat and Urry. _Social Theory as Science_.
                Andrew Sayer. _Method in Social Science: A Realist
                        Approach_.

        Difficult (200 points):
                Herbert Marcuse. _One Dimensional Man_.
                Michel Foucault. _The Order of Things_.
                
Further suggestions along these lines will be greatly appreciated.

In addition, there will be approximately 10 class assignments.  Most of
these will require a 1-2 page written report from the student.
Approximately half of the assignments will be done in groups, and the group
will often turn in a collective product.  The assignments will follow those
suggested in Babbie and in Andranovich.  The latter basically takes the
students through six sequential steps in researching an urban issue of
local interest.  The groups will do these six steps.  The other assignments
will be designed around Babbie and supplemental readings.  Since we will
jigsaw in class about once a week, students will be responsible for about
one supplemental reading about as often.  At least part of the assignment will
require something based heavily on the reading.  Each assignment will be
worth 45 points.


8.3 Individual Work
-------------------

In addition to labs and class sessions, each student will be required to
keep a log.  The purpose of the log is both to synthesize the course's
different threads and to provide an explicit check on understanding.
I will collect the logs three times during the semester and grade them.
The log will be worth 100 points.

Beyond the logs, the students can do a number of optional individual
projects.  The number of points associated with each project depends on its
difficulty.  The projects I am contemplating right now include:

        A second book review (see points above).
        An annotated bibliography (150 points).
        A review of the literature (200 points).
        A full-blown thesis proposal (350 points).
        A research paper on some method and its use in planning research
        (400 points)

These will be due finals week.

To assist in writing thesis proposals, I am recommending:

   Locke. 1993. _Proposals that Work_. Sage.


9 Conclusion
--------------

Once again, let me thank all of you for your comments and suggestions.  If
you have any further comments or suggestions on my proposed syllabus,
please let me know.  I hope you find this summary as useful as I found your
input.

Marsh Feldman
Community Planning                      Phone: 401/792-2248
204 Rodman Hall                           FAX: 401/792-4395
University of Rhode Island           Internet: [EMAIL PROTECTED]
Kingston, RI 02881-0815
