For your info.

STEPHEN B. ALAYON
Data Bank Senior Information Assistant
Library and Data Banking Services Section
Training and Information Division
Aquaculture Department (AQD)
Southeast Asian Fisheries Development Center (SEAFDEC)
Tigbauan, Iloilo 5021 Philippines
URL: http://www.seafdec.org.ph
Telephone No.: 63 33 5119170 to 71 local 344
Fax No.: 63 33 5119174
Mobile Phone No.: 63 919 4506688
Email: [email protected], [email protected]

----- Forwarded Message -----
*From:* Flor Lacanilao <[email protected]>
*To:* Stephen B. Alayon <[email protected]>
*Sent:* Wednesday, April 13, 2011 7:41 AM
*Subject:* Fw: [PAARL] commentaries on Philippine Science, Academic Reform,
ISI publications, Open Access


Stephen,

Your disseminating further the discussions on problems of science in the
Philippines is not only a good idea but also speaks well for AQD. I have
just a few more comments to add, for sharing with the other e-groups.

About the ranking of universities: rankings that use only performance
indicators (like the Shanghai Jiao Tong University’s Academic Ranking of
World Universities) are more respected and more helpful for developing
countries -- the criteria used are objective, internationally accepted, and
easily accessible. Hence, we can learn from them to improve performance. No
Philippine university has yet made this ranking.

On the other hand, university rankings that rely largely on expert opinion
and capability indicators (e.g., student-faculty ratio), like the THES-QS
rankings, are hardly useful for developing countries. Besides, expert
opinion has been criticized:

   “We criticize the recent expert-based rankings by showing that the
correlation of expert score and bibliometric outcomes is practically zero.
This finding casts severe doubts on the reliability of these expert-based
rankings” (Leiden University, The Netherlands, 2005).

In the THES-QS rankings, 4 Philippine universities made the list. The same
was true of the aborted Asia Magazine ranking of Asian universities, where
some Philippine universities also met the poor criteria used.

On the use of ISI-indexed journals, note that not all journals in ISI’s
Master List are ISI-indexed; some are not covered by ISI’s citation indexes
but appear only in indexes carried (hosted) by Thomson ISI. Examples of
these indexes are *Biological Abstracts* and *Zoological Record*. These have
no strict criteria for journal coverage and should not be mistaken for
ISI-indexed.

Could you also share this paper of mine, "Basic problems in Philippine
science and education" (see below)? It provides the supporting data for my
Inquirer commentary, *Democratic governance impedes academic
reform<http://opinion.inquirer.net/inquireropinion/columns/view/20110314-325325/Democratic-governance-impedes-academic-reform>*.
The paper reviews the state of Philippine science and education.

Thanks.
Flor

--------------------

*Basic problems in Philippine science and education*

By Flor Lacanilao, February 2011

Producing a bigger workforce and putting in more money have been the usual
answers to address the poor state of Philippine science and education. The
lack of workforce and money has been the convenient excuse for poor work
output or outright failure to do the job. An analysis of the situation,
however, does not point to them as the real causes of this problem. It is
failure to attend to the basic causes of the problem that has led to the
continued deterioration of our science and education situation (see "*Only
science can solve
poverty<http://opinion.inquirer.net/inquireropinion/columns/view/20070621-72442/Only_science_can_solve_poverty>
*").

A major culprit is performance evaluation based on the personal judgment of
unpublished and poorly published officials and faculty members -- instead of
on objective, internationally accepted criteria.

Consider the predicament of the *National Science
Consortium<http://www.philstar.com/Article.aspx?articleId=658044&publicationSubCategoryId=75>
*, which has been put up by the country’s seven top universities and the
Science Education Institute of the Department of Science and Technology
(DOST). Its objectives are (1) to enhance the capability of the higher
education system by producing technically competent PhD and MS graduates and
(2) to address the lack of Filipino researchers, enabling our country to
compete economically with its neighbors (see "*Time to abandon
GDP<http://www.nature.com/nature/journal/v463/n7283/full/463849b.html>
*").

The Consortium cites UNESCO figures showing the Philippines with 7,500
researchers in 2009 against Singapore’s 28,000. Last year, at an
international conference in Japan on teaching and research activities, a
report on scientific publications from 10 Asian countries showed the
Philippines with the fewest published papers.

The Philippines had only 178 valid publications in 2005, whereas tiny
Singapore had 3,609, or 20 times more. Since Singapore’s 28,000 researchers
were about 4 times the Philippines’ 7,500, while its research output of
3,609 was 20 times our 178, Singapore’s researchers were about 5 times more
productive than their Philippine counterparts. How did this happen?
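The comparison above is simple arithmetic. As an illustrative check only,
using the figures quoted in the paragraph above, the per-researcher
productivity ratio works out as follows:

```python
# Figures as cited in the text: 2005 publication counts and
# researcher headcounts for the Philippines and Singapore.
ph_papers, ph_researchers = 178, 7_500
sg_papers, sg_researchers = 3_609, 28_000

# Papers per researcher in each country.
ph_rate = ph_papers / ph_researchers   # about 0.024
sg_rate = sg_papers / sg_researchers   # about 0.129

# Ratio of the two rates: Singapore's researchers come out
# roughly 5 times as productive, consistent with the text.
print(round(sg_rate / ph_rate, 1))  # → 5.4
```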

*Failed programs*

Data from the last 3 decades show that although the country’s number of
researchers has been increasing, properly published papers per PhD even
decreased; indeed, the national output hardly increased. A lot of the
research funds went to unpublished or poorly published researchers who
produced unpublished or poorly published papers (without adequate peer
review) -- in short, gray literature. Gray literature is not taken
seriously, and it doesn’t count in international evaluations of research
performance, as shown by the above-cited study of publications in 10 Asian
countries. It doesn’t contribute to development.
(See "*Continuing problems with gray
literature<http://www.springerlink.com/content/n30l1m04106256mg/>
*.")

At UP Diliman, the College of Science, established in 1983, aimed at an
all-PhD faculty. It succeeded in doubling the number to over 90 PhDs in 10
years. But the share of properly published papers decreased -- from 24 to 15
percent of total publications, or from 12 to only 5 percent of the papers
produced per PhD. This means that for every 20 papers, only 1 counted in
international evaluations of S&T performance. (For example, see “
*Science in 100 years of UP
biology<http://josecarilloforum.com/pdf/Science%20in%20100%20years%20of%20UP%20biology.pdf>
.*”)

Further, the DOST launched the Science and Technology Master Plan for
1990-2000. Its R&D budget increased yearly from 1991 to 1995, a four-fold
growth from P800 million to over P3 billion. In 1992-1998, it implemented
the Engineering and Science Education Project (ESEP). This was to upgrade
engineering and science through PhD and MS scholarships. “If one surveys
local universities today, one will find that many of the leaders were ESEP
graduates,” a report said.

Yet with all the money, effort, and years spent on those programs, the
country's S&T hardly improved. The research output remained the same from
1981 to 1995. And the total of Philippine publications in leading journals
even decreased from 2000 to 2005 -- from a mere 185 down to 178, as reported
by the above-cited study of publications in Asia. Meanwhile, China, South
Korea, Singapore, and Thailand saw about a two-fold increase in publications
during the same period, with China more than doubling its output again from
2004 to 2008 (see
“*Global Research Report:
China<http://researchanalytics.thomsonreuters.com/m/pdfs/grr-china-nov09.pdf>
*“).

Hence, the programs largely succeeded only in increasing the number of poor
mentors and decreasing the general quality of graduates (for example, see
the “*Scientific Productivity of Selected Universities in the
Philippines<http://ched-zrc.dlsu.edu.ph/pdfs/valencia2004.pdf>
*”).

There are 764 PhD faculty members from our top universities involved in the
Consortium program. Granting they produced all of the country's 178
publications in 2005, their research productivity is only 0.23 per PhD. This
is far below that of the National University of Singapore, whose 154 PhDs in
science produced 389 publications in 1994, or 2.5 per PhD -- more than 10
times the output of our best graduate faculties. (World-class performance is
1 per PhD per year.) How can they be expected to properly mentor the
projected 250 PhD and 350 MS graduates yearly? (See “*Training graduate
students<http://www.philstar.com/Article.aspx?articleId=326514&publicationSubCategoryId=75>
.*”)
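The per-PhD figures above follow from the same kind of arithmetic. A minimal
sketch, using only the numbers quoted in the paragraph above:

```python
# Figures as cited in the text.
# Philippine Consortium: 764 PhD faculty, credited with at most
# the country's 178 publications in 2005.
ph_per_phd = 178 / 764    # about 0.23 publications per PhD

# National University of Singapore: 154 science PhDs produced
# 389 publications in 1994.
nus_per_phd = 389 / 154   # about 2.5 publications per PhD

# The text's world-class benchmark is 1 publication per PhD per year.
print(round(ph_per_phd, 2))             # → 0.23
print(round(nus_per_phd, 1))            # → 2.5
# NUS output per PhD is more than 10 times the Philippine figure.
print(round(nus_per_phd / ph_per_phd))  # → 11
```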

The programs cited above have failed because unpublished and poorly
published officials and faculty members have relied on personal judgment
when they evaluate research proposals and publications, screen faculty
applicants, and select candidates for promotions, recognitions, and awards.

*Effective systems*

There are objective and internationally accepted criteria for performance
evaluation ("*The scientific impact of
nations<http://www.nature.com/nature/journal/v430/n6997/full/430311a.html>
"*). When implemented with cash rewards for outstanding publications, these
criteria greatly increased useful research output. At the UP, where a
P55,000 reward has been given per published paper in an international
journal, publications increased from 25 to 40 percent of the national total
between 1997-99 and 2002. (The combined publication output of La Salle,
Ateneo, UST, and San Carlos during the same period increased from only 7.8
to 8.0 percent of the national total; the rest was largely produced by the
International Rice Research Institute in Los Baños with only about 60
PhDs.)

At the Southeast Asian Fisheries Development Center (SEAFDEC) in Iloilo,
which offered a cash incentive of 50% of annual salary, publications of the
50 all-Filipino research staff -- with only 9 PhDs -- had increased
sevenfold by 1993, after 6 years. In fast-developing countries like China
and Brazil,
other forms of incentives have significantly increased published papers in
international journals. (See “*Celebrating the UP
Centennial<http://www.bahaykuboresearch.net/index.php?module=article&view=52>
*.”)

With an incentive system that uses objective, internationally accepted
criteria, it would be possible for the Philippines to produce the desired
output, save on research funds, justify even higher R&D budgets, and find a
viable way to really reform its science and higher education. Program funds
should therefore be made available as rewards for properly published papers
and as support for proposals of published proponents. The accepted criteria
are journals covered in Thomson ISI indexes. Important journals are covered
in *Science Citation
Index<http://science.thomsonreuters.com/cgi-bin/jrnlst/jloptions.cgi?PC=K>
* and *Social Sciences Citation
Index<http://science.thomsonreuters.com/cgi-bin/jrnlst/jloptions.cgi?PC=SS>
*. These are internationally acknowledged indicators of academic
performance.

For its part, the Commission on Higher Education (CHED) should do away with
putting up research journals; instead, it should encourage researchers to
publish in journals covered in *Thomson ISI
indexes<http://www.sciencemag.org/cgi/content/full/325/5943/920>
*, like the two mentioned above.

CHED is supporting 190 state universities and colleges where only 10 percent
of the faculty members have PhD degrees. Worse yet, how many of them are
properly published? Can they effectively manage research journals or review
manuscripts (see “*Scientific Publishing
Standards<http://www.sciencemag.org/cgi/content/full/321/5894/1271>
*”)? The proliferation of such poorly managed journals must stop.

The objective performance indicators and the incentive system will minimize
subjective evaluations by nonscientists, fix the other wrong research
practices, improve the performance of the 7,500 researchers, and produce
better-qualified mentors and instructors for graduate and undergraduate
students. They will also provide better teachers for primary and secondary
education (see an example of how our academic scientists are solving
persistent problems in basic education in “*Education reform amid
scarcity<http://opinion.inquirer.net/inquireropinion/letterstotheeditor/view/20101013-297598/Education-reform-amid-scarcity>
*”).

As Carl Wieman, Nobel laureate in physics, has observed, it is doubtful that
great progress can be made at the primary and secondary levels until a
higher standard of science learning is set at the post-secondary level (see
“*Reinventing science
education<http://www.taipeitimes.com/News/editorials/archives/2008/02/15/2003401383>
*”).

In sum, the Philippines should radically reform its approaches to solving
problems to ensure the achievement of its objectives. *It is not about the
advancement of science just for science’s sake. Rather, it is about
advancing science in the context of a desire to improve the human condition.
This entails attention to the processes by which understandings from the
natural sciences, the social sciences, and engineering influence—or fail to
influence—public policy* ("S&T for Sustainable
Well-Being<http://www.sciencemag.org/cgi/reprint/319/5862/424.pdf>
").

The Philippines will then have a chance of catching up with its more
progressive neighboring countries in science, education, and national
progress. There is "*No shortcut to
progress<http://opinion.inquirer.net/inquireropinion/letterstotheeditor/view/20080219-119740/No_shortcut_to_progress>
*".

Florlaca

-- 
You received this message because you are subscribed to the Google Groups 
"Filipino Librarians" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to 
[email protected].
For more options, visit this group at 
http://groups.google.com/group/filipinolibrarians?hl=en.
