Hi Lyn,
There is an embargo patch; see
http://wiki.dspace.org/index.php/User:Emetsger:Embargo
Hope that helps
Claudia Jürgen
Lyn Amery wrote:
Hi all,
I know it's possible to restrict access to an item in DSpace, but is it
possible to do so only until a specified date?
Hi,
Have you looked at the embargo-on-bitstream patch at
http://wiki.dspace.org/index.php/Embargo_on_Bitstream_v2_(JSP) ?
Obi
From: Lyn Amery [mailto:lyn.am...@sro.wa.gov.au]
Sent: 14 January 2009 08:39
Hello,
I was following the wiki instructions on how to build DSpace 1.5.1 but
came across this issue:
[#compile]
Reactor Summary:
DSpace Addon Modules
Daniel,
A quick question:
Are you building from the DSpace Assembly and Configuration project
that NetBeans creates (this project corresponds to the 'dspace/'
subfolder)? You *must* build from that project the first time, as it
will initialize and build all the other projects.
When I right
Is there something simple I can place in the jsp that will prohibit
the crawlers from
using my server resources?
TIA,
Jeff
Jeffrey Trimble
Systems Librarian
Maag Library
Youngstown State University
330-941-2483 (Office)
jtrim...@cc.ysu.edu
http://www.maag.ysu.edu
http://digital.maag.ysu.edu
Hello DSpacers,
I am quite desperate to resolve a problem I am having on Debian: running
any kind of admin function within DSpace gives the exception shown below.
One way to trigger it is to create a community, but it seems to happen
whenever I try to create anything. The so-called missing class is
Jeff:
We had an issue with our local google instance crawling our DSpace
installation and causing huge issues. I re-wrote the robots.txt to
disallow anything besides the item pages themselves - no browsing
pages or search pages and whatnot. Here is a copy of ours:
User-agent: *
Disallow:
As of DSpace 1.5, sitemaps are supported which allow search engines to
selectively crawl only new items, while massively reducing the server
load:
http://www.dspace.org/1_5_1Documentation/ch03.html#N10B44
Unfortunately, it seems that relatively few DSpace instances actually
use this feature.
Our DSpace server has LDAP and Password authentication enabled. Is
there an easy way to modify the Manakin Reference theme's login
links to point directly to http://server/ldap-login rather than the
login chooser at http://server/login? I need to leave password
authentication enabled on the
Jeff:
What I am using is a robots.txt file that I put in the dspace webapps
directory in tomcat. I think it's working (at least we haven't
crashed lately). If you're interested in seeing my robots.txt file,
I can send it to you.
At 01:09 PM 1/14/2009, Jeffrey Trimble wrote:
Is there
On Wed, 14 Jan 2009, Shane Beers wrote:
We had an issue with our local google instance crawling our DSpace
installation and causing huge issues. I re-wrote the robots.txt to disallow
anything besides the item pages themselves - no browsing pages or search
pages
and whatnot. Here is a
We don't use the data in either the history or historystate tables,
however they both have over a million rows in them. If I delete all
rows in both tables, is this going to cause us any problems? It doesn't
look like referential integrity is going to be a problem if I run a SQL
query to do
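If the history data really is unused, a minimal cleanup sketch (assuming PostgreSQL and the stock DSpace schema, where nothing references these two tables) might look like the following. Back up first; this is irreversible.

```sql
-- TRUNCATE is much faster than DELETE for emptying whole tables
-- and reclaims the disk space immediately.
BEGIN;
TRUNCATE TABLE history;
TRUNCATE TABLE historystate;
COMMIT;

-- Refresh planner statistics afterwards (outside the transaction).
VACUUM ANALYZE;
```

TRUNCATE takes an exclusive lock, so run it during a quiet period; it will fail up front if any foreign key does reference the tables, which is a useful safety check in itself.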
On Wed, Jan 14, 2009 at 9:30 PM, Thornton, Susan M. (LARC-B702)[NCI INFORMATION SYSTEMS] susan.m.thorn...@nasa.gov wrote:
The error message is actually looking for WebContinuation$UserObject in
org/apache/cocoon/components/flow/*JAVASCRIPT*/fom and your grep found it
in
This would be a good opportunity to construct a reasonably good default
robots.txt file and add it to the documentation set.
At http://ses.library.usyd.edu.au/robots.txt, I have the following:
User-agent: *
Crawl-Delay: 11
Disallow: /browse
Disallow: /browse?
Disallow: /browse-title
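A fuller sketch of a DSpace-oriented robots.txt along these lines (the paths assume JSPUI defaults at the webapp root; adjust for your own context path and UI) might be:

```
User-agent: *
Crawl-delay: 11
# Keep crawlers out of dynamically generated listing pages, which are
# expensive to render and only duplicate the item pages' content.
Disallow: /browse
Disallow: /browse-title
Disallow: /browse-author
Disallow: /browse-date
Disallow: /browse-subject
Disallow: /simple-search
Disallow: /advanced-search
# Item and bitstream pages themselves remain crawlable.
```

Note that `Crawl-delay` is a non-standard extension honoured by some crawlers and ignored by others (notably Googlebot), so the Disallow rules do the real work here.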
I have created a journals community with over 1,400 sub-communities (phew!).
These sub-communities are the journal titles. Within each sub-community
there will be sub-sub-communities, one per issue (an issue will be a
collection). When viewing the journals community it takes ages to load the
page
The error message is actually looking for WebContinuation$UserObject in
org/apache/cocoon/components/flow/JAVASCRIPT/fom and your grep found it in
org/apache/cocoon/components/flow/JAVASC/fom/FOM_WebContinuation$UserObject.class
(I've put the difference in ALL CAPS).
Hope this helps!
Sue
Andrew,
Try exposing the value of CLASSPATH in the context where java is called for
DSpace. Locate your .jar file and check that its path is listed in CLASSPATH.
I find this command useful for showing all the gory details of java and DSpace:
/bin/ps -ww -Heo cmd | grep java
-- Van Ly :
I ended up using a Java web filter called UrlRewriteFilter to redirect
requests for /login to /ldap-login (http://tuckey.org/urlrewrite/),
which is similar to using mod_rewrite in Apache. This solves my
immediate need, but I'd still be interested to know if there is
another way to do this,
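For reference, the kind of UrlRewriteFilter rule that accomplishes this (a sketch assuming the filter is already registered in the theme webapp's web.xml, and using the /login and /ldap-login paths from the messages above) is:

```xml
<!-- WEB-INF/urlrewrite.xml -->
<urlrewrite>
    <rule>
        <!-- Send anyone hitting the login chooser straight to LDAP login.
             <from> is a regular expression matched against the request path. -->
        <from>^/login$</from>
        <to type="redirect">/ldap-login</to>
    </rule>
</urlrewrite>
```

Using `type="redirect"` issues an HTTP redirect so the browser's address bar reflects /ldap-login; a plain `<to>` would forward internally instead.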
Hi Stuart,
Stuart Lewis [sdl] wrote:
Hi Tom,
I have my handle server up and running; it's responding over the necessary
ports. But it's not able to get the actual handles from my Oracle database.
The database box is located across a firewall, and we've already opened a
port so that