Re: [Dri-devel] future of DRI?
On Fri, Feb 28, 2003 at 10:29:04PM -0800, Ian Romanick wrote:
> Daniel Vogel wrote:
> > Does DRI have a future with neither NVIDIA nor ATI participating?
> > Are people actually talking to them about why they don't use it and
> > what has to be done to remedy this fact? Shouldn't this be a top priority?
> >
> > To clarify: I meant what has to be done to make DRI (direct rendering
> > *infrastructure*) attractive for IHVs. I didn't mean to imply what has
> > to be done to get NVIDIA or ATI to release open source drivers and whatnot.
>
> The uncanny thing is that this is almost EXACTLY my job description. :)

How about writing documentation for the kernel-level API then? It's not very attractive to write to an API that doesn't have any official documentation.

---
This sf.net email is sponsored by: ThinkGeek
Welcome to geek heaven. http://thinkgeek.com/sf
___
Dri-devel mailing list
[EMAIL PROTECTED]
https://lists.sourceforge.net/lists/listinfo/dri-devel
Re: [Dri-devel] future of DRI?
Daniel Vogel wrote:
> Does DRI have a future with neither NVIDIA nor ATI participating? Are
> people actually talking to them about why they don't use it and what
> has to be done to remedy this fact? Shouldn't this be a top priority?
>
> To clarify: I meant what has to be done to make DRI (direct rendering
> *infrastructure*) attractive for IHVs. I didn't mean to imply what has
> to be done to get NVIDIA or ATI to release open source drivers and whatnot.

The uncanny thing is that this is almost EXACTLY my job description. :)

> The open source/closed source discussion has been beaten to death and
> is irrelevant to this thread.
>
> My point was/is that without NVIDIA or ATI using the DRI infrastructure
> it is doomed to fail.

Right now both ATI and Kyro are using the infrastructure, but they both have their own internal enhancements. That's why both drivers install their own libGL.so. I'm trying to beef up the GLX support in the open-source libGL.so so that these IHVs no longer need to do this.

What we need to do is open a dialogue with the IHVs to find out what they need. The problem is that there isn't really an authoritative body to have such a discussion. Tungsten probably could do it, and IHVs would probably respond to them, but I think Tungsten already has its hands full with projects that pay the rent. :)
Re: [Dri-devel] How to add new functionality to libGL
Felix Kühling wrote:
> Hello, I just started working on a revision of the DRI Configuration
> design doc based on the feedback I received. As Brian suggested I want
> to implement the functionality for acquiring available configuration
> options in libGL. I had a look at xc/lib/GL/dri and it looks as if
> dri_glx.[ch] would be the right place. Is that correct?

Actually, you should probably look in xc/lib/GL/glx/glxcmd.c. I would add a new function that returns the options. Programs would get a pointer to this function via glXGetProcAddress. You'd have to add the function to the table in glxcmd.c.

The stuff in xc/lib/GL/dri gets compiled into the client-side driver. The stuff in xc/lib/GL/glx gets compiled into libGL.so.

> How would the new functions be exported to client applications? They
> are obviously not declared in any standard header files. An interested
> client would also have to do some version checking in order to test
> whether the new functions are really available. And it would have to
> check if libGL is from DRI in the first place. How would all this work?

Programs would know whether or not the function exists by the return value of glXGetProcAddress. If it returns NULL, it ain't there. :)
Re: [Dri-devel] future of DRI?
On Fri, 28 Feb 2003 14:57:35 -0800 Philip Brown <[EMAIL PROTECTED]> wrote:
> On Fri, Feb 28, 2003 at 05:15:58PM -0500, Daniel Vogel wrote:
> > So, are there technical reasons for NVIDIA not to use the DRI if ATI does?
>
> yes.
>
> NVIDIA already has their own cross-platform low level driver, with a
> cross-platform 3d API. It's their "UDI", Unified Driver Interface, or
> something like that.
>
> So if they switched to using DRI, they would then be looking at
> rewriting their own crossplatform(?) opengl implementation that
> currently nicely slots onto their UDI.

Has anyone put any thought into a 'migration' kit of sorts for DRI, so that IHVs can easily port their Windows drivers to DRI? Is it possible? Would people use it? More importantly, is there sufficient community support for it?

It certainly would provide a fair bit of impetus (if well publicized) for manufacturers to release drivers for DRI, if it required a minimum of effort. On the other hand, it would discourage them from releasing specs, which might be a very undesirable consequence. I guess in the end it depends on what the aim is -- more 3D drivers under Linux that use DRI, or fewer ones that are open source.

Comments?

David Bronaugh
[Dri-devel] Will work for free
"Mike A. Harris" <[EMAIL PROTECTED]> wrote:
> >> I don't see 100 unpaid hackers hacking feverishly

Since you have the specs, tell me how to reset a Rage128 from protected mode so that I can add it to the framebuffer driver. I know about going into real mode and calling C000:3. This can't be done from a device driver. vbios.vm86 does it from the command line and it's a 500K program.

My application calls for multiple Rage128 cards in a single machine. Only the first one gets reset by the BIOS at power on. I need to know what register to poke to reset the card, how to set up its RAM controller, and whatever else is needed to do a reset. I even tried disassembling the VBIOS to figure this out. The necessary info is part of the source of the VBIOS ROM.

Tell me the sequence needed and this unpaid hacker will add a reset function to the Rage128 FB driver for free.

=
Jon Smirl
[EMAIL PROTECTED]
Re: [Dri-devel] Re: future of DRI?
On Fri, 28 Feb 2003 15:56:41 -0500 (EST) "Mike A. Harris" <[EMAIL PROTECTED]> wrote:
> I don't see 100 unpaid hackers hacking feverishly on anything X related
> right now. Why would 100 unpaid hackers come out of the woodwork all of
> a sudden? Quite unrealistic.

If you stood a chance in hell of writing a COMPLETE driver, I reckon dozens of people would come out of the woodwork. Without specs, though, it'll never happen.
Re: [Dri-devel] TEXMEM , I can find that only in CVS? And fordebian?
On Fre, 2003-02-28 at 21:51, AnonimoVeneziano wrote:
> Hi all, I want to know where I can find the texmem trunk of DRI,
> possibly in Debian packages. And if there aren't Debian packages, the
> other ways ;-) .

I'm not aware of any Debian packages of the texmem branch; I already spend quite some time maintaining the packages for the trunk and the mach64 branch. As for getting the branch from CVS, see http://dri.sourceforge.net/doc/cvs.html .

--
Earthling Michel Dänzer (MrCooper) / Debian GNU/Linux (powerpc) developer
XFree86 and DRI project member / CS student, Free Software enthusiast
Re: [Dri-devel] RE: future of DRI?
As Mike implied, NV doesn't do the DRI precisely BECAUSE fragmentation isn't good. Their primary interest is a single <> code base for <> and not a single unfragmented framework for 3D accelerated XFree86!!

Mike

Mike A. Harris wrote:
> On Fri, 28 Feb 2003, Daniel Vogel wrote:
> > Fragmentation still isn't good, which brings me back to my original
> > question whether folks are talking to NVIDIA why they aren't using
> > the DRI framework.
>
> I'm sure if Nvidia wanted to use DRI they would do so. What benefit
> would there be to Nvidia really of ditching their existing
> infrastructure which is closed source, and switching both their kernel
> side and userland side code to closed source code which uses the DRI
> infrastructure? Their code is also shared between Windows and I believe
> Macintosh, and DRI is not available on those platforms.
>
> I don't see exactly what you mean by fragmentation. 2 vendors using
> closed source code aren't fragmenting anything except for their own
> internal interests. The drivers are a black box really, and could be
> using any kind of interface, be it DRI, or some proprietary solution.
> It neither helps nor hinders the DRI project really.
[Dri-devel] RE: future of DRI?
On Fri, 28 Feb 2003, Daniel Vogel wrote:
> Fragmentation still isn't good, which brings me back to my original
> question whether folks are talking to NVIDIA why they aren't using the
> DRI framework.

I'm sure if Nvidia wanted to use DRI they would do so. What benefit would there be to Nvidia really of ditching their existing infrastructure which is closed source, and switching both their kernel side and userland side code to closed source code which uses the DRI infrastructure? Their code is also shared between Windows and I believe Macintosh, and DRI is not available on those platforms.

I don't see exactly what you mean by fragmentation. 2 vendors using closed source code aren't fragmenting anything except for their own internal interests. The drivers are a black box really, and could be using any kind of interface, be it DRI, or some proprietary solution. It neither helps nor hinders the DRI project really.

--
Mike A. Harris ftp://people.redhat.com/mharris
OS Systems Engineer - XFree86 maintainer - Red Hat
RE: [Dri-devel] RE: future of DRI?
> NVidia wanted to keep the source code base of the Windows drivers and
> the Linux drivers as close as possible, including what would be
> considered kernel mode stuff. They started with Windows drivers and
> adapted that to Linux. Part of their porting effort was building a
> kernel level wrapper, which "emulated" the minimum win32 kernel service
> APIs the rest of the kernel module needed.

I'm always amused by the reasons people come up with for things like this...

Note: In no way am I speaking officially as an employee of NVIDIA Corporation.

--
Gareth Hughes ([EMAIL PROTECTED])
OpenGL Developer, NVIDIA Corporation
Re: [Dri-devel] How to add new functionality to libGL
Felix Kühling wrote:
> Hello, I just started working on a revision of the DRI Configuration
> design doc based on the feedback I received. As Brian suggested I want
> to implement the functionality for acquiring available configuration
> options in libGL. I had a look at xc/lib/GL/dri and it looks as if
> dri_glx.[ch] would be the right place. Is that correct?
>
> How would the new functions be exported to client applications? They
> are obviously not declared in any standard header files. An interested
> client would also have to do some version checking in order to test
> whether the new functions are really available. And it would have to
> check if libGL is from DRI in the first place. How would all this work?
>
> All this would not be an issue if we limited access to this
> functionality to one command line programme that is distributed with
> DRI. Every application that is interested in the DRI configuration
> would have to use that programme. But in that case there's not much
> point implementing this functionality in a *shared* library. Am I
> missing something here?

I thought libGL would have a function, like __glXGetOptions(Display *dpy, int screen), that you'd access via dlopen()/dlsym(). __glXGetOptions, in turn, would use the usual DRI mechanism to identify the right driver, open the driver with dlopen(), get the driver's GetOptions() function with dlsym(), and call it. Something like that.

-Brian
Re: [Dri-devel] Getting to a 3D base...
On Fre, 2003-02-28 at 19:03, Jon Smirl wrote:
> It is a fact that Microsoft Longhorn and the Mac GUI are moving towards
> 3D hardware for their base GUI. It is also a fact that it will take a
> lot of effort and probably several years to move X in the same
> direction. Personally I don't want to see Linux in the position of
> having the new 3D effects in Windows and the Mac and not being able to
> respond. I'm willing to put some time and effort into this and maybe
> others are too.
>
> I don't buy the "support old hardware" argument for stopping the
> forward progress of DRI. With that argument we'd still all be running
> in x86 real mode and the Athlon-64 would be pointless. Backwards
> compatibility is important and fallbacks should be provided, but it's
> not a reason for stopping the use of new hardware features.
>
> So what is the best design for achieving this? The project has to have
> DRI at its core since it's the only choice for 3D acceleration on
> Linux. We also have to preserve X Windows compatibility for all of the
> existing apps.
>
> My first thought would be to get DRI running standalone (like the fbdri
> project but synced to CVS), then modify X to load on top of the
> standalone DRI. After that works, rework the 2D X driver to use the DRI
> API instead of using the hardware directly.
>
> The reasoning behind a standalone DRI is to make it easier to build a
> 3D windowing system. To create a 3D windowing system a mechanism is
> needed to get existing X windows into textures so that they can be
> transformed. If it proves too hard to alter X, standalone DRI would
> allow X to be replaced with something like NanoX.

Have you looked at DirectFB (http://directfb.org/)? Integrating the DRI with that might be interesting; they already have an X server which maps X windows to DirectFB surfaces (mostly the same way as is done on Mac OS X). When I tried XDirectFB a while ago, it was amazingly snappy even without any hardware acceleration.
--
Earthling Michel Dänzer (MrCooper) / Debian GNU/Linux (powerpc) developer
XFree86 and DRI project member / CS student, Free Software enthusiast
Re: [Dri-devel] RE: future of DRI?
NVidia wanted to keep the source code base of the Windows drivers and the Linux drivers as close as possible, including what would be considered kernel mode stuff. They started with Windows drivers and adapted that to Linux. Part of their porting effort was building a kernel level wrapper, which "emulated" the minimum win32 kernel service APIs the rest of the kernel module needed.

On Fri, 28 Feb 2003, Daniel Vogel wrote:
> Fragmentation still isn't good, which brings me back to my original
> question whether folks are talking to NVIDIA why they aren't using the
> DRI framework.
>
> -- Daniel, Epic Games Inc.
Re: [Dri-devel] RE: future of DRI?
On Fri, Feb 28, 2003 at 05:33:29PM -0500, Daniel Vogel wrote:
> Fragmentation still isn't good, which brings me back to my original
> question whether folks are talking to NVIDIA why they aren't using the
> DRI framework.

Probably because of their UDA? I suspect it is easier to support a more common source base which is also under total vendor control. There is also the multiple accelerated heads issue with the current DRI.

arkadi.
[Dri-devel] Re: future of DRI?
On Fri, 28 Feb 2003, Jon Smirl wrote:
> > I don't see 100 unpaid hackers hacking feverishly on anything X
>
> Obviously you wouldn't see 100 people working full time

Obviously. Not even part time. I doubt you'd see more than 20-30 people "working" on it at all. And by that I mean making any significant contributions, not a one-shot bug fix. I mirror what Gareth Hughes said. There aren't hundreds of performance obsessed hackers itching to hack on DRI source code, whether it is the existing open source code, or whether the sources for both ATI's and Nvidia's drivers were released.

In fact, I'll go one step further, and say that I believe if ATI and Nvidia *both* released the _entire_ source code to their drivers, and published their complete video hardware specifications on the front page of slashdot, practically zero work would be done with any of it except perhaps by existing members of the DRI project, or people already hacking on DRI a fair amount on this list. I bet it would draw next to zero new developers. I also doubt that any performance improvements would be made to those currently closed source drivers any time soon. More likely, the code would be used to improve all video drivers, including the Matrox ones, and other video hardware as well. That would take a LONG time to do too. It would probably take enough time for experienced DRI people alone to even get a hold on the full driver source code base enough to make use of it.

> but you might get detailed bug reports or patches from 100 people. I
> know I get patches from all over the place for code I have written.

Detailed bug reports? LOL. 95% of all XFree86 related bug reports from people are pure useless garbage. Missing tonnes of information, and quite often you have to dig the information out of the person, and they get increasingly frustrated with you and upset and flame you so you don't even want to look at their problem anymore. Patches? Not many.
A few patches come in (specifically related to DRI, that is), yes, but they are very few (IMHO) compared to the work the DRI and XFree86 projects do. Quite frankly, the collective code of the kernel interfaces, DRI, Mesa, etc. used in implementing open source 3D is quite complicated in nature, and there aren't 100s of people out there IMHO who grok it enough to be able to make as many contributions as you might think. Having more code out there doesn't mean there are going to be more people hacking on it.

> A couple of months ago I wanted to make a few changes to the ATI
> Rage128 framebuffer driver. It took me a month to get ATI to give me
> the register specs. I still can't get the 3D spec and this is for five
> year old hardware. I fixed a couple of bugs in the framebuffer. I just
> left the others alone since I can't get them to tell me how to reset
> the card from protected mode.

You might not have been able to get them, but many of us do have them. While it would be nice if hardware vendors would release specs openly, it is their decision. The fact that they allow ANYONE to have them is FANTASTIC, or we would have little to no open source drivers at all. Be thankful to those companies that do allow some developers to have access to documentation.

> I just don't understand what is to be gained by keeping the Rage128
> hardware programming spec secret. After all a device driver for a board
> has the best copy protection in the world.

That's their business really. I've got the Rage 128 specs, and many other X developers do too. If other people can demonstrate their abilities to work on X or DRI, and show enough initiative to work on the code *without* the specs, they are much more likely to get them. I know several people who hack on the ATI drivers who do NOT have the specs, and they manage to do pretty well without them, all things considered. ATI would probably give them the specs based on their track record if they asked.
> We don't know if NVidia or ATI have incorporated 3rd party code into
> their drivers.

I am pretty sure that it has been acknowledged that they have.

> There are other solutions.
> 1) ATI could simply open source their hardware spec and let us write
> the drivers

To be honest, ATI provided many developers with Radeon R200 specifications prior to the hardware even being released. I don't know specifically how many people had them or have them now, but I do know that most if not all of the developers who have them have full time jobs, and do not have time to spend 8 hours a day working on video drivers. The largest development efforts will come from people who can afford to spend hours every day working on them, and that is most likely only going to happen if it is funded development work under contract, such as the work that Tungsten Graphics is doing for The Weather Channel right now. Or the Intel i8x0 driver work that's been done. There is open source work being done by volunt
Re: [Dri-devel] future of DRI?
On Fri, Feb 28, 2003 at 05:15:58PM -0500, Daniel Vogel wrote:
> So, are there technical reasons for NVIDIA not to use the DRI if ATI
> does?

yes.

NVIDIA already has their own cross-platform low level driver, with a cross-platform 3d API. It's their "UDI", Unified Driver Interface, or something like that.

So if they switched to using DRI, they would then be looking at rewriting their own crossplatform(?) opengl implementation that currently nicely slots onto their UDI.
Re: [Dri-devel] Using DRI to implement 2D X drivers
On Fre, 2003-02-28 at 23:11, Philip Brown wrote:
> On Fri, Feb 28, 2003 at 05:06:15PM +0100, Michel Dänzer wrote:
> > > I haven't looked at this, but if the DRM modules know about setting
> > > up the hardware and changing resolutions then there may be no need
> > > for framebuffer any more. You could write a generic framebuffer
> > > driver that was implemented in terms of the DRM interface. But this
> > > wasn't part of the initial idea.
> >
> > But what's the point, instead of simply using the framebuffer device,
> > which has been established and is needed for console on many
> > architectures?
>
> is your definition of "many architectures" == "many variants of linux" ?

Many ports of Linux, yes. I would have said 'platform' had I meant multiple OSs.

--
Earthling Michel Dänzer (MrCooper) / Debian GNU/Linux (powerpc) developer
XFree86 and DRI project member / CS student, Free Software enthusiast
[Dri-devel] RE: future of DRI?
> Well... for starters ATI *is* using the DRI infrastructure. Does that
> mean that you think DRI is doomed to success now?

I guess it means that it's at least not fundamentally flawed ;-)

Fragmentation still isn't good, which brings me back to my original question whether folks are talking to NVIDIA why they aren't using the DRI framework.

-- Daniel, Epic Games Inc.
[Dri-devel] RE: future of DRI?
On Fri, 28 Feb 2003, Daniel Vogel wrote:
> > Does DRI have a future with neither NVIDIA nor ATI participating?
> > Are people actually talking to them about why they don't use it and
> > what has to be done to remedy this fact? Shouldn't this be a top
> > priority?
>
> To clarify: I meant what has to be done to make DRI (direct rendering
> *infrastructure*) attractive for IHVs. I didn't mean to imply what has
> to be done to get NVIDIA or ATI to release open source drivers and
> whatnot.
>
> The open source/closed source discussion has been beaten to death and
> is irrelevant to this thread.
>
> My point was/is that without NVIDIA or ATI using the DRI infrastructure
> it is doomed to fail.

Well... for starters ATI *is* using the DRI infrastructure. Does that mean that you think DRI is doomed to success now?

--
Mike A. Harris ftp://people.redhat.com/mharris
OS Systems Engineer - XFree86 maintainer - Red Hat
[Dri-devel] DRI trunk 4.3.0 merge?
Now that XFree86 4.3.0 is tagged/released, is there a plan for merging 4.3.0 into the DRI trunk?

--
Leif Delgass
http://www.retinalburn.net
RE: [Dri-devel] future of DRI?
> > My point was/is that without NVIDIA or ATI using the DRI
> > infrastructure it is doomed to fail.
>
> Uhmmm... ATI *does* use the DRI infrastructure for their drivers.

I googled for it a while but couldn't find any hints that they do, so I assumed they don't. Thanks for the clarification.

So, are there technical reasons for NVIDIA not to use the DRI if ATI does?

-- Daniel, Epic Games Inc.
Re: [Dri-devel] Using DRI to implement 2D X drivers
On Fri, Feb 28, 2003 at 05:06:15PM +0100, Michel Dänzer wrote:
> > I haven't looked at this, but if the DRM modules know about setting
> > up the hardware and changing resolutions then there may be no need
> > for framebuffer any more. You could write a generic framebuffer
> > driver that was implemented in terms of the DRM interface. But this
> > wasn't part of the initial idea.
>
> But what's the point, instead of simply using the framebuffer device,
> which has been established and is needed for console on many
> architectures?

is your definition of "many architectures" == "many variants of linux" ?
Re: [Dri-devel] future of DRI?
On Fri, Feb 28, 2003 at 09:45:26PM, José Fonseca wrote:
> Even if DRI stops being the main source of 3D drivers for Linux/BSD, it
> will remain the main source of _open_source_ 3D drivers. That, alone,
> gives DRI a competitive advantage over any other solution. Just in the
> same way that it has given Linux, Apache, XFree86, and many other
> successful OSS projects.

all those other successful open source projects had open (or at least, known) hardware to work with. It's not the same equation in the 3d graphics field.

> Bear in mind that OSS developers don't need absolute market dominance
> of the software they write. They obviously enjoy that what they write
> is widely used - it means that it is useful - but they don't depend on
> it.

True.
[Dri-devel] How to add new functionality to libGL
Hello, I just started working on a revision of the DRI Configuration design doc based on the feedback I received. As Brian suggested I want to implement the functionality for acquiring available configuration options in libGL. I had a look at xc/lib/GL/dri and it looks as if dri_glx.[ch] would be the right place. Is that correct?

How would the new functions be exported to client applications? They are obviously not declared in any standard header files. An interested client would also have to do some version checking in order to test whether the new functions are really available. And it would have to check if libGL is from DRI in the first place. How would all this work?

All this would not be an issue if we limited access to this functionality to one command line programme that is distributed with DRI. Every application that is interested in the DRI configuration would have to use that programme. But in that case there's not much point implementing this functionality in a *shared* library. Am I missing something here?

Regards,
Felix
[EMAIL PROTECTED]
"You can do anything, just not everything at the same time!"
RE: [Dri-devel] future of DRI?
On Fri, 28 Feb 2003, Daniel Vogel wrote:
> > Does DRI have a future with neither NVIDIA nor ATI participating?
> > Are people actually talking to them about why they don't use it and
> > what has to be done to remedy this fact? Shouldn't this be a top
> > priority?
>
> To clarify: I meant what has to be done to make DRI (direct rendering
> *infrastructure*) attractive for IHVs. I didn't mean to imply what has
> to be done to get NVIDIA or ATI to release open source drivers and
> whatnot.
>
> The open source/closed source discussion has been beaten to death and
> is irrelevant to this thread.
>
> My point was/is that without NVIDIA or ATI using the DRI infrastructure
> it is doomed to fail.

Uhmmm... ATI *does* use the DRI infrastructure for their drivers.

Adam
Re: [Dri-devel] future of DRI?
On Fri, Feb 28, 2003 at 03:00:03PM -0500, Daniel Vogel wrote:
> > So what is the best design for achieving this? The project has to
> > have DRI at its core since it's the only choice for 3D acceleration
> > on Linux.
>
> Ironically, the only real choice for 3D acceleration on Linux is using
> NVIDIA and ATI's (non DRI) binary drivers.

AFAIK that is incorrect: ATI's drivers are DRI based. They only aren't Mesa based. NVIDIA is the only one which has a proprietary solution altogether.

> Does DRI have a future with neither NVIDIA nor ATI participating? Are
> people actually talking to them about why they don't use it and what
> has to be done to remedy this fact? Shouldn't this be a top priority?
>
> Without support for recent cards, DRI will become completely obsolete.

This is nonsense. Please enumerate the IHVs that participated in early Linux development. By your thinking, Linux shouldn't exist, as it was doomed from the start.

Regarding the trend of the Linux drivers being written by the manufacturer, I think it's mainly a sign that Linux is becoming a market important enough to dedicate developers to it. They are just doing what they do for all other platforms.

With this I'm not saying I'm happy with this trend. Of course I'm not. I don't like to be locked in to closed-source solutions which I can't control. But I'm sure that if there was a way to avoid this trend, that would have happened.

Even if DRI stops being the main source of 3D drivers for Linux/BSD, it will remain the main source of _open_source_ 3D drivers. That, alone, gives DRI a competitive advantage over any other solution. Just in the same way that it has given Linux, Apache, XFree86, and many other successful OSS projects.

Bear in mind that OSS developers don't need absolute market dominance of the software they write. They obviously enjoy that what they write is widely used - it means that it is useful - but they don't depend on it.

José Fonseca
RE: [Dri-devel] future of DRI?
> Does DRI have a future with neither NVIDIA nor ATI participating? > Are people actually talking to them about why they don't use it and > what has to be done to remedy this fact? Shouldn't this be a top priority? To clarify: I meant what has to be done to make DRI (direct rendering *infrastructure*) attractive for IHVs. I didn't mean to imply what has to be done to get NVIDIA or ATI to release open source drivers and whatnot. The open source/closed source discussion has been beaten to death and is irrelevant to this thread. My point was/is that without NVIDIA or ATI using the DRI infrastructure it is doomed to fail. -- Daniel, Epic Games Inc.
Re: [Dri-devel] Getting to a 3D base...
On Fri, 28 Feb 2003 10:03:10 -0800 (PST) Jon Smirl <[EMAIL PROTECTED]> wrote: > It is a fact that Microsoft Longhorn and the Mac GUI > are moving towards 3D hardware for their base GUI. It > is also a fact that it will take a lot of effort and > probably several years to move X in the same > direction. Personally I don't want to see Linux in the > position of having the new 3D effects in Windows and > the Mac and not being able to respond. I'm willing to > put some time and effort into this and maybe others are > too. > > I don't buy the "support old hardware" argument for > stopping the forward progress of DRI. With that > argument we'd still all be running in x86 real mode > and the Athlon-64 would be pointless. Backwards > compatibility is important and fallbacks should be > provided, but it's not a reason for stopping the use > of new hardware features. There is a reason why I would *not* like to have 3D as the base for the next generation of X servers, and that is stability. Current DRI drivers are not stable even as 3D drivers alone, whereas 2D-accelerated drivers are rock solid (at least for the cards I have tried). I have run Linux+XFree86 in academic environments for many years now and have always liked the reliability factor: buggy 2D applications don't crash the X server or kernel, and the whole system lives happily for weeks without rebooting. None of this is true with the 3D drivers. Please don't take this as flame bait (I know this is the DRI devel list), but I would concentrate on good, stable, old-fashioned 3D acceleration before trying to race Apple and MS in the "sexiest 3D window-system" contest. Pedro.
[Dri-devel] Re: future of DRI?
--- "Mike A. Harris" <[EMAIL PROTECTED]> wrote: > On Fri, 28 Feb 2003, Jon Smirl wrote: > > I don't see 100 unpaid hackers hacking feverishly on > anything X Obviously you wouldn't see 100 people working full time but you might get detailed bug reports or patches from 100 people. I know I get patches from all over the place for code I have written. A couple of months ago I wanted to make a few changes to the ATI Rage128 framebuffer driver. It took me a month to get ATI to give me the register specs. I still can't get the 3D spec and this is for five year old hardware. I fixed a couple of bugs in the framebuffer. I just left the others alone since I can't get them to tell me how to reset the card from protected mode. I just don't understand what is to be gained by keeping the Rage128 hardware programming spec secret. After all a device driver for a board has the best copy protection in the world. But in ATI's defense they are much better than Nvidia. I gave away my NVidia hardware and I won't be buying any more. > While I would love nothing more than to see that, I > highly doubt > that either ATI or Nvidia have the legal rights to > open source > the entire source code of all of their drivers. > Parts of them > perhaps, but I doubt all of them, and even then > parts of them > would be patent encumbered still. We don't know if NVidia or ATI have incorporated 3rd party code into their drviers. There are other solutions. 1) ATI could simply open source their hardware spec and let us write the drivers 2) ATI could shift resources and contribute to the DRI code base instead of working on their own. 3) ATI can license their patents for royalty free use when developing drivers for their hardware. Other uses would require fees, like the new W3C patent position. Again, no knock on ATI. They are doing the best of the bunch. = Jon Smirl [EMAIL PROTECTED] __ Do you Yahoo!? Yahoo! 
Re: [Dri-devel] future of DRI?
Jon Smirl wrote: > > I really don't understand ATI's position on Linux > drivers. They have better hardware but they are losing > because of their drivers. I can't think of a better > solution than having a couple hundred highly skilled, > performance obsessed, unpaid hackers fixing their code > for them. Somehow, I dont' think this is an entirely accurate description of the situation. When I was heavily involved in the DRI project, there were significant contributions made by perhaps a dozen people, many of whom worked for PI/VA (now TG). By the looks of things, the list of active developers has grown some, which is great to see, but there has never been anything approaching "a couple hundred highly skilled, performance obsessed, unpaid hackers" just waiting to "fix their code for them". -- Gareth --- This sf.net email is sponsored by:ThinkGeek Welcome to geek heaven. http://thinkgeek.com/sf ___ Dri-devel mailing list [EMAIL PROTECTED] https://lists.sourceforge.net/lists/listinfo/dri-devel
[Dri-devel] TEXMEM , I can find that only in CVS? And for debian?
Hi all, I want to know where I can find the texmem trunk of DRI, preferably as Debian packages - and if there aren't Debian packages, the other ways to get it ;-) . Can anyone tell me the known bugs these drivers currently have? Thanks Bye
[Dri-devel] Re: future of DRI?
On Fri, 28 Feb 2003, Jon Smirl wrote: >> Does DRI have a future with neither NVIDIA nor ATI >> participating? > >I really don't understand ATI's position on Linux >drivers. They have better hardware but they are losing >because of their drivers. I can't think of a better >solution than having a couple hundred highly skilled, >performance obsessed, unpaid hackers fixing their code >for them. This is not an argument against such a possible source code release, but I'm merely trying to be realistic here. I don't see 100 unpaid hackers hacking feverishly on anything X related right now. Why would 100 unpaid hackers come out of the woodwork all of a sudden? Quite unrealistic. What would really happen is that the existing DRI team members, and others closely working on DRI _right_now_, would likely look at it and probably work on it, along with possibly a small number of new people. I doubt there would be even 5 new people touching the code, if that. >ATI and Nvidia have both disassembled each other's >drivers so there are no secrets between the two. Is there evidence to support this claim, or is it merely conjecture? >Another argument could be that another chip manufacturer could >clone a chip and use the drivers for free. I don't see that >happening given the complexity of the chips. Another argument is that the binary drivers of both companies most likely contain 3rd party intellectual property, and patented techniques, some of which they themselves may have patented, and others which they have possibly licensed from other companies, perhaps many companies. Would all of those companies want to have their IP instantly open sourced and free for all to see/use? While I would love nothing more than to see that, I highly doubt that either ATI or Nvidia have the legal rights to open source the entire source code of all of their drivers. Parts of them perhaps, but I doubt all of them, and even then parts of them would be patent encumbered still.
>Finally ATI could be afraid of patent issues by opening the >source. But this would just make the patent issues slightly >easier to find, they'd still have the same problems with closed >source. If they were being accused the lawyers would get the >source anyway. Not ATI - any vendor. You assume perhaps that there are patents being violated and these companies want to hide something, and that by opening their code they could get sued. A more likely hypothesis is that they have licensed patented technology themselves and have the right to use it in their products, drivers, etc. but do not have the right to give it to other people or distribute the source code. That is not at all unrealistic to assume. There is also the threat that the code might contain something that is unknowingly patented, such as a technique totally invented on one's own that just coincidentally was invented by someone else first and patented. Opening up source code can cause other companies to jump in and see which of their patents you might have violated so they can sue you. Again, this isn't an argument against opening of such code, but rather an attempt to explain some real life reasons why some companies do not do so. >What about bad fixes to the code? ATI can just control the CVS >and only apply patches that they are happy with. ATI should >continue with their paid developers, just make their changes >public too. I'm sure that if there weren't already other reasons, this would not be a problem. >Linux programmers like to fix things when they are broken. I >just removed the ATI radeon drivers from my system and went back >to the DRI ones. About once a day the ATI driver will lock up. >If I had the source I would have poked around and tried to fix >it. Without the source I threw the drivers in the trash. I agree with you, and many people on this list would want to twiddle with the source as well. We're the minority however.
Most people do not want to hack on source code of software that doesn't work for them, even fewer on device drivers, and even fewer on video drivers. >ATI is really underestimating the skills of some of the hackers >in the Linux community. Some really good code comes out of >those long Siberian/Finnish winters. They are also missing the >opportunity to lead in a fast growing market. Just look at the >cover of Business Week. I completely disagree here. For the above paragraph to be true, ATI would have to be sitting there in Markham thinking "Well, we would release the source code of our binary drivers, but the open source community isn't talented enough, so we won't." Not only do I think that is not what they think, I think it is not even in the equation. I have no idea what ATI, Nvidia, Kyro, or any other video hardware vendor's explicitly detailed reasons for not releasing the complete source code to their drivers are, however I do understand enough to be able to piece together several legal and o
Re: [Dri-devel] future of DRI?
--- Daniel Vogel <[EMAIL PROTECTED]> wrote: > Does DRI have a future with neither NVIDIA nor ATI > participating? I really don't understand ATI's position on Linux drivers. They have better hardware but they are losing because of their drivers. I can't think of a better solution than having a couple hundred highly skilled, performance-obsessed, unpaid hackers fixing their code for them. ATI and Nvidia have both disassembled each other's drivers so there are no secrets between the two. Another argument could be that another chip manufacturer could clone a chip and use the drivers for free. I don't see that happening given the complexity of the chips. Finally, ATI could be afraid of patent issues by opening the source. But this would just make the patent issues slightly easier to find; they'd still have the same problems with closed source. If they were accused, the lawyers would get the source anyway. What about bad fixes to the code? ATI can just control the CVS and only apply patches that they are happy with. ATI should continue with their paid developers, just make their changes public too. Linux programmers like to fix things when they are broken. I just removed the ATI radeon drivers from my system and went back to the DRI ones. About once a day the ATI driver will lock up. If I had the source I would have poked around and tried to fix it. Without the source I threw the drivers in the trash. ATI is really underestimating the skills of some of the hackers in the Linux community. Some really good code comes out of those long Siberian/Finnish winters. They are also missing the opportunity to lead in a fast-growing market. Just look at the cover of Business Week. = Jon Smirl [EMAIL PROTECTED]
[Dri-devel] future of DRI?
> So what is the best design for achieving this? The > project has to have DRI at its core since it's the > only choice for 3D acceleration on Linux. Ironically, the only real choice for 3D acceleration on Linux is using NVIDIA and ATI's (non-DRI) binary drivers. Does DRI have a future with neither NVIDIA nor ATI participating? Are people actually talking to them about why they don't use it and what has to be done to remedy this fact? Shouldn't this be a top priority? Without support for recent cards, DRI will become completely obsolete. -- Daniel, Epic Games Inc.
[Dri-devel] Getting to a 3D base...
It is a fact that Microsoft Longhorn and the Mac GUI are moving towards 3D hardware for their base GUI. It is also a fact that it will take a lot of effort and probably several years to move X in the same direction. Personally I don't want to see Linux in the position of having the new 3D effects in Windows and the Mac and not being able to respond. I'm willing to put some time and effort into this and maybe others are too. I don't buy the "support old hardware" argument for stopping the forward progress of DRI. With that argument we'd still all be running in x86 real mode and the Athlon-64 would be pointless. Backwards compatibility is important and fallbacks should be provided, but it's not a reason for stopping the use of new hardware features. So what is the best design for achieving this? The project has to have DRI at its core since it's the only choice for 3D acceleration on Linux. We also have to preserve X Window System compatibility for all of the existing apps. My first thought would be to get DRI running standalone (like the fbdri project, but synced to CVS), then modify X to load on top of the standalone DRI. After that works, rework the 2D X driver to use the DRI API instead of using the hardware directly. The reasoning behind a standalone DRI is to make it easier to build a 3D windowing system. To create a 3D windowing system, a mechanism is needed to get existing X windows into textures so that they can be transformed. If it proves too hard to alter X, standalone DRI would allow X to be replaced with something like NanoX. Another solution would be to leave everything as is, then 3D-enable the root window and start hacking on the existing X windowing system. = Jon Smirl [EMAIL PROTECTED]
Re: [Dri-devel] Using DRI to implement 2D X drivers
On Fri, Feb 28, 2003 at 03:29:51PM +0100, Michel Dänzer wrote: > On Fri, 2003-02-28 at 10:11, Felix Kühling wrote: > > > > I think this discussion is getting off track. We have to make clear what > > we are talking about here. From the first mail on this subject I got the > > impression, the goal was > > > > - to implement accelerated 2D primitives using the 3D graphics engine. > > > > This makes a lot of sense, as each transition between usage of the 2D > > and 3D engine has to flush the graphics pipeline (at least on radeon). > > It would both increase performance and make the interaction between > > Xserver and DRI clients potentially simpler. > > Maybe. I'm not sure if the 3D engine can reasonably accelerate the > traditional X primitives and meet its pixel perfect requirements though. > > Also, keep in mind that even the Radeon 3D drivers use the 2D engine for > things like texture uploads. Notice that chips like the Permedia 2/3 used their 3D engine for doing 2D rendering. Sure, they were chips from 3Dlabs coming from the 3D world to the 2D one, but still, maybe such an approach will become more common in the future. Friendly, Sven Luther
Re: [Dri-devel] Using DRI to implement 2D X drivers
On Fri, Feb 28, 2003 at 09:25:56AM +0100, Sven Luther wrote: | On Thu, Feb 27, 2003 at 02:01:22PM -0800, Jon Smirl wrote: | > ... Moore's law | > means that everyone is going to have super 3D hardware | > in a couple of years. | | Even Embedded or handheld systems ? 3D-accelerated cell phones will start shipping in 2004. (According to an OpenGL-ES (embedded-systems subset) presentation given at the December ARB meeting. One of the companies working on this is 3d4W, www.3d4w.com .) Some graphics environments will be a lot more constrained than desktop systems, but the basic stuff will soon be available everywhere. Allen
Re: [Dri-devel] Using DRI to implement 2D X drivers
On Fri, Feb 28, 2003 at 03:04:08PM +0000, Ian Molton wrote: | On Thu, 27 Feb 2003 18:17:33 -0800 | Allen Akin <[EMAIL PROTECTED]> wrote: | | > | > Then there are the arguments for deeper color channels based on the | > need for higher-precision intermediate results -- for transparency, | | > antialiasing, multipass rendering, etc. These are compelling. | | no argument there. Apologies if I've misunderstood your comment, but it looks like you're saying that just because intermediate results might require higher precision than the final displayable colors, you don't want high-precision displayable color buffers. On the contrary, making the displayable color buffer high-precision saves time (no need to copy from a high-precision intermediate buffer to a low-precision displayable buffer) and space (you only need one high-precision buffer rather than one high-precision buffer plus one low-precision buffer). Similar arguments apply in other situations, like dynamically generating textures. Oh, and I should have mentioned high-dynamic-range color representations in the previous note. Once you've seen them, there's no going back to the old way. :-) Allen
Re: [Dri-devel] Using DRI to implement 2D X drivers
On Fri, 2003-02-28 at 17:02, Jon Smirl wrote: > --- Michel Dänzer <[EMAIL PROTECTED]> wrote: > > > It would be simple to lift the mode setting and > > > hardware identification code out of the fb drivers > > > > > > But what would be the advantage over leaving it as a > > framebuffer device > > or whatever in the first place? > > > The X philosophy is to ship a complete system for all > supported OS's. Moving the code from framebuffer to > DRM would remove the requirement for framebuffer to be > loaded. > > I haven't looked at this but if the DRM modules know > about setting up the hardware and changing resolutions > then there may be no need for framebuffer any more. > You could write a generic framebuffer driver that was > implemented in terms of the DRM interface. But this > wasn't part of the initial idea. But what's the point, instead of simply using the framebuffer device, which has been established and is needed for console on many architectures? *shrug* I could see the DRM providing a wrapper interface for a framebuffer device, or other ways of cooperation between the two, but this seems a strange idea. Maybe that's just me though. -- Earthling Michel Dänzer (MrCooper)/ Debian GNU/Linux (powerpc) developer XFree86 and DRI project member / CS student, Free Software enthusiast
Re: [Dri-devel] Using DRI to implement 2D X drivers
--- Michel Dänzer <[EMAIL PROTECTED]> wrote: > > It would be simple to lift the mode setting and > > hardware identification code out of the fb drivers > > > But what would be the advantage over leaving it as a > framebuffer device > or whatever in the first place? > The X philosophy is to ship a complete system for all supported OS's. Moving the code from framebuffer to DRM would remove the requirement for framebuffer to be loaded. I haven't looked at this, but if the DRM modules know about setting up the hardware and changing resolutions then there may be no need for framebuffer any more. You could write a generic framebuffer driver that was implemented in terms of the DRM interface. But this wasn't part of the initial idea. = Jon Smirl [EMAIL PROTECTED]
Re: [Dri-devel] Using DRI to implement 2D X drivers
On Thu, 27 Feb 2003 17:20:19 -0800 Ian Romanick <[EMAIL PROTECTED]> wrote: > 64-bpp or 128-bpp isn't useful for display, but > is useful. Since you're talking intermediate, yes, agreed.
Re: [Dri-devel] Using DRI to implement 2D X drivers
On Thu, 27 Feb 2003 18:17:33 -0800 Allen Akin <[EMAIL PROTECTED]> wrote: > > Then there are the arguments for deeper color channels based on the > need for higher-precision intermediate results -- for transparency, > antialiasing, multipass rendering, etc. These are compelling. no argument there.
Re: [Dri-devel] Using DRI to implement 2D X drivers
On Fre, 2003-02-28 at 10:11, Felix Kühling wrote: > > I think this discussion is getting off track. We have to make clear what > we are talking about here. From the first mail on this subject I got the > impression, the goal was > > - to implement accelerated 2D primitives using the 3D graphics engine. > > This makes a lot of sense, as each transition between usage of the 2D > and 3D engine has to flush the graphics pipeline (at least on radeon). > It would both, increase performance and make the interaction between > Xserver and DRI clients potentially simpler. Maybe. I'm not sure if the 3D engine can reasonably accelerate the traditional X primitives and meet its pixel perfect requirements though. Also, keep in mind that even the Radeon 3D drivers use the 2D engine for things like texture uploads. -- Earthling Michel Dänzer (MrCooper)/ Debian GNU/Linux (powerpc) developer XFree86 and DRI project member / CS student, Free Software enthusiast --- This sf.net email is sponsored by:ThinkGeek Welcome to geek heaven. http://thinkgeek.com/sf ___ Dri-devel mailing list [EMAIL PROTECTED] https://lists.sourceforge.net/lists/listinfo/dri-devel
Re: [Dri-devel] Using DRI to implement 2D X drivers
On Thu, 2003-02-27 at 20:52, Martin Spott wrote: > Michel Dänzer <[EMAIL PROTECTED]> wrote: > > > The radeon driver uses the DRM for 2D acceleration when DRI is enabled, > > Is the radeon driver the only one doing so ? I think all drivers supporting the DRI have to deal with 2D and 3D concurrency one way or the other, if that's what you mean. > Is it possible that heavy simultaneous use of 2D and 3D graphics is > responsible for the DRM freezing the X server with FlightGear ? You > remember, I get the stuff near to stable as long as FlightGear is the > only X client running on the display. It might be related. You could try disabling the 2D acceleration primitives with the XaaNo... options to see if that makes a difference. -- Earthling Michel Dänzer (MrCooper)/ Debian GNU/Linux (powerpc) developer XFree86 and DRI project member / CS student, Free Software enthusiast
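For anyone wanting to try that, the XaaNo... options go in the card's Device section of XF86Config. Here is a minimal sketch; the option names below are a plausible sample and should be checked against the XF86Config man page of your XFree86 version:

```
Section "Device"
    Identifier "Radeon"
    Driver     "radeon"
    # Disable individual XAA 2D acceleration primitives one at a time
    # to narrow down which one triggers the lockup:
    Option "XaaNoScreenToScreenCopy"
    Option "XaaNoSolidFillRect"
    Option "XaaNoCPUToScreenColorExpandFill"
    Option "XaaNoPixmapCache"
EndSection
```

Re-enabling them one by one after a lockup stops is the quickest way to isolate the offending primitive.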
Re: [Dri-devel] GL image distortions with Radeon VE
On Friday, 28 February 2003 14:18, Felix Kühling wrote: > On 28 Feb 2003 14:02:14 +0100 > > Michel Dänzer <[EMAIL PROTECTED]> wrote: > > On Fri, 2003-02-28 at 13:30, Felix Kühling wrote: > > > On 28 Feb 2003 12:00:48 GMT > > > > > > Martin Spott <[EMAIL PROTECTED]> wrote: > > > > Nick Kurshev <[EMAIL PROTECTED]> wrote: > > > > > Image distortion became visible in oct 2002. > > > > > > > > This probably was the time when hardware-TCL came into play !? > > > > You could try to set $RADEON_TCL_FORCE_DISABLE to 't' and see if it > > > > makes any difference, > > > > > > No, it won't. I had the same problem with my Radeon 7500 when I > > > disabled TCL. > > > > Besides, the VE doesn't have a TCL unit in the first place... > > > > > I submitted a patch for this problem two weeks ago. IIRC it was > > > applied at least to the trunk. Try a CVS upgrade. > > > > Actually, I haven't seen anything committed about this yet. Nick, you > > should try the patch Keith posted recently. > > You're right, I confused the SWTCL vertex corruption with the SWTCL > texture state problem. For the latter one there seems to be no proper > fix yet. I tried Keith's last patch, and it didn't solve some of the > problems that were fixed by more radical patches I proposed earlier. The > Quake3 problem is still there with Keith's patch. Can we have some more "unified corruption cases", please? I think your case could be the same as the one Charl and I have with VTK. Keith's last patch didn't fix it either. Regards, Dieter
Re: [Dri-devel] GL image distortions with Radeon VE
On 28 Feb 2003 14:02:14 +0100 Michel Dänzer <[EMAIL PROTECTED]> wrote: > On Fri, 2003-02-28 at 13:30, Felix Kühling wrote: > > On 28 Feb 2003 12:00:48 GMT > > Martin Spott <[EMAIL PROTECTED]> wrote: > > > > > Nick Kurshev <[EMAIL PROTECTED]> wrote: > > > > > > > Image distortion became visible in oct 2002. > > > > > > This probably was the time when hardware-TCL came into play !? > > > You could try to set $RADEON_TCL_FORCE_DISABLE to 't' and see if it makes > > > any difference, > > > > No, it won't. I had the same problem with my Radeon 7500 when I disabled > > TCL. > > Besides, the VE doesn't have a TCL unit in the first place... > > > I submitted a patch for this problem two weeks ago. IIRC it was > > applied at least to the trunk. Try a CVS upgrade. > > Actually, I haven't seen anything committed about this yet. Nick, you > should try the patch Keith posted recently. You're right, I confused the SWTCL vertex corruption with the SWTCL texture state problem. For the latter one there seems to be no proper fix yet. I tried Keith's last patch, and it didn't solve some of the problems that were fixed by more radical patches I proposed earlier. The Quake3 problem is still there with Keith's patch. Felix
Re: [Dri-devel] Using DRI to implement 2D X drivers
On Don, 2003-02-27 at 23:01, Jon Smirl wrote: > -- Sven Luther <[EMAIL PROTECTED]> wrote: > > Notice that the DRI drivers don't do anything like > > mode setting and > > such, they depend on the X drivers for that. So if > > you take away the X > > driver, you will not be able to get anything > > outputed on your monitor. > > Unless you use the fbdev drivers for example. > > It would be simple to lift the mode setting and > hardware identification code out of the fb drivers and > add it to the DRM kernel driver. If you were still > using the 2D drivers the new code in DRM would just be > ignored. But what would be the advantage over leaving it as a framebuffer device or whatever in the first place? -- Earthling Michel Dänzer (MrCooper)/ Debian GNU/Linux (powerpc) developer XFree86 and DRI project member / CS student, Free Software enthusiast --- This sf.net email is sponsored by:ThinkGeek Welcome to geek heaven. http://thinkgeek.com/sf ___ Dri-devel mailing list [EMAIL PROTECTED] https://lists.sourceforge.net/lists/listinfo/dri-devel
Re: [Dri-devel] GL image distortions with Radeon VE
On Fre, 2003-02-28 at 13:30, Felix Kühling wrote:
> On 28 Feb 2003 12:00:48 GMT
> Martin Spott <[EMAIL PROTECTED]> wrote:
>
> > Nick Kurshev <[EMAIL PROTECTED]> wrote:
> >
> > > Image distortion became visible in oct 2002.
> >
> > This probably was the time when hardware TCL came into play!?
> > You could try to set $RADEON_TCL_FORCE_DISABLE to 't' and see if it
> > makes any difference.
>
> No, it won't. I had the same problem with my Radeon 7500 when I
> disabled TCL.

Besides, the VE doesn't have a TCL unit in the first place...

> I submitted a patch for this problem two weeks ago. IIRC it was
> applied at least to the trunk. Try a CVS upgrade.

Actually, I haven't seen anything committed about this yet. Nick, you
should try the patch Keith posted recently.

--
Earthling Michel Dänzer (MrCooper)/ Debian GNU/Linux (powerpc) developer
XFree86 and DRI project member / CS student, Free Software enthusiast
Re: [Dri-devel] Using DRI to implement 2D X drivers
On Fri, 2003-02-28 at 12:19, Sven Luther wrote:
> So, no 2D windows on the face of rotating cubes?

Once your 2D windows are textures, the rest is very much free,
including scaling, rotation, occlusion and alpha blending. You can use
it to build the base X interfaces, then worry about exposing the
wackier bits to users.
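The "alpha blending comes for free" point boils down to the standard source-over operator that a 3D engine applies per fragment when a window's contents are drawn as a translucent texture. A minimal software sketch of that one operation (the type and function names, and the 8-bit non-premultiplied pixel layout, are illustrative assumptions):

```c
/* Source-over compositing of one RGBA pixel onto another -- the
 * per-fragment operation a 3D engine performs when a 2D window is
 * drawn as a translucent texture over the desktop. */
#include <stdint.h>

typedef struct { uint8_t r, g, b, a; } rgba8;

/* out = src * src.a + dst * (1 - src.a), in 8-bit fixed point */
static rgba8 blend_over(rgba8 src, rgba8 dst)
{
    unsigned inv = 255 - src.a;
    rgba8 out;
    out.r = (uint8_t)((src.r * src.a + dst.r * inv) / 255);
    out.g = (uint8_t)((src.g * src.a + dst.g * inv) / 255);
    out.b = (uint8_t)((src.b * src.a + dst.b * inv) / 255);
    out.a = (uint8_t)(src.a + dst.a * inv / 255);
    return out;
}
```

Scaling, rotation and occlusion then fall out of where the textured quad is placed and depth-tested; only the blend touches the pixel values.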
Re: [Dri-devel] GL image distortions with Radeon VE
On 28 Feb 2003 12:00:48 GMT Martin Spott <[EMAIL PROTECTED]> wrote:

> Nick Kurshev <[EMAIL PROTECTED]> wrote:
>
> > Image distortion became visible in oct 2002.
>
> This probably was the time when hardware TCL came into play!?
> You could try to set $RADEON_TCL_FORCE_DISABLE to 't' and see if it
> makes any difference.

No, it won't. I had the same problem with my Radeon 7500 when I
disabled TCL. I submitted a patch for this problem two weeks ago. IIRC
it was applied at least to the trunk. Try a CVS upgrade.

> Martin.
> --
> Unix _IS_ user friendly - it's just selective about who its friends are!

Regards,
Felix
Re: [Dri-devel] Using DRI to implement 2D X drivers
On Fri, Feb 28, 2003 at 01:14:09PM +0000, Alan Cox wrote:
> On Fri, 2003-02-28 at 08:25, Sven Luther wrote:
> > Also, before you speak about unifying the 2D and 3D drivers
> > you need to look at how a 3D desktop would work.
>
> I would assume roughly like the Apple renders seem to work now, or
> how the opengl accelerated canvas works in E. That bit is hardly
> rocket science.

So, no 2D windows on the face of rotating cubes? I was thinking more on
the metaphor side of things than on the technical one when you asked
the question. But sure, a desktop taskbar where each icon would be an
animated 3D object floating a small distance above the taskbar would be
neat; maybe the taskbar could even have a horizontal feel or something
like that.

BTW, I like the way Apple does icons that grow when you pass over them;
I guess they will use the same trick for the tabs in their new Safari
browser.

Friendly,

Sven Luther
Re: [Dri-devel] Using DRI to implement 2D X drivers
On Fri, 2003-02-28 at 08:25, Sven Luther wrote:
> Also, before you speak about unifying the 2D and 3D drivers
> you need to look at how a 3D desktop would work.

I would assume roughly like the Apple renders seem to work now, or how
the OpenGL accelerated canvas works in E. That bit is hardly rocket
science.
Re: [Dri-devel] GL image distortions with Radeon VE
Nick Kurshev <[EMAIL PROTECTED]> wrote:

> Image distortion became visible in oct 2002.

This probably was the time when hardware TCL came into play!? You could
try to set $RADEON_TCL_FORCE_DISABLE to 't' and see if it makes any
difference.

Martin.
--
Unix _IS_ user friendly - it's just selective about who its friends are!
Re: [Dri-devel] Using DRI to implement 2D X drivers
On Thu, Feb 27, 2003 at 06:04:36PM -0800, Jon Smirl wrote:
> --- Ian Romanick <[EMAIL PROTECTED]> wrote:
> > Let's see, XFree86 supports 2D for about 50 different chips, and it
> > supports 3D for about 5. MS might be in a position to cast away
> > support for older hardware, but I don't think that we are.
>
> This is backwards thinking. In five years a Radeon 9700 is going to
> cost $10 and be integrated into the motherboard.

Ok, let's look at it differently. Writing an accelerated 2D driver is
quite easy: a week's work at most if everything works out well and you
have docs available. This would include Xv acceleration as well. Now,
writing the 3D driver part is quite a bit more difficult, and will not
work on every OS XFree86 supports.

Friendly,

Sven Luther
[Dri-devel] GL image distortions with Radeon VE
Hello!

I've met this problem (see attachment) long ago, but it seems that
nobody has fixed it :( This problem happens not only with this game but
with Quake3 too! It looks like every odd frame contains these black
squares but every even frame is free of them, which causes image
flickering! These squares appear not in every scene, but in most scenes
of GL-117! The image distortion became visible in oct 2002. I guessed
that XFree86 4.3.0 would be free of similar artifacts, but
unfortunately they still exist. (This screenshot was produced with
today's snapshot of DRI CVS.)

Is this a problem only of the Radeon VE, or is it a well-known problem?
If there is any workaround, then PLEASE tell me!

WBR! Nick

pgp0.pgp -- Description: PGP signature
Re: [Dri-devel] Using DRI to implement 2D X drivers
On Fri, 28 Feb 2003 04:39:58 +0100 Bernhard Kaindl <[EMAIL PROTECTED]> wrote:

> On Thu, 27 Feb 2003, Jon Smirl wrote:
>
> > Long ago I loved the command line. I was an expert at it. When
> > Windows 1.0 came out I got my first exposure to a mouse. For about
> > a year I wouldn't get one, but now I can't live without it.
>
> Similar for me. And as I've read about a 3D window system, my mind
> started to travel...
>
> I was thinking that it would be good to have a 3D X server which can
> do the same things as today's X servers do, but also supports a 3D
> window manager API and possibly an extended 3D application window,
> or better, object API.
>
> Such advanced 3D window managers could then use these APIs to
> arrange 2D windows and 3D windows within the three-dimensional world
> of the X server, and of course the views into this world.
>
> Maybe even more, such as brightness and colour correction.
>
> For example, the 3D window manager could then manage some xterm and
> emacs windows for those people who can't live without emacs or their
> favourite ASCII programs, and of course their favourite web browser.

I think this discussion is getting off track. We have to make clear
what we are talking about here. From the first mail on this subject I
got the impression the goal was to implement accelerated 2D primitives
using the 3D graphics engine. This makes a lot of sense, as each
transition between usage of the 2D and 3D engine has to flush the
graphics pipeline (at least on Radeon). It would both increase
performance and make the interaction between the X server and DRI
clients potentially simpler.

Now it sounds more as if you wanted to propose something like X12 which
adds a completely new API. I don't think there is a need for that. You
have OpenGL and GLX for that. Someone correct me if I'm wrong, but my
understanding is that you can, even now, use GLX in a window manager.
Another important issue in this context is accelerated indirect
rendering, in order to allow untrusted local and remote clients to
render hardware-accelerated 3D graphics. Lastly, let me point out that
one of the goals of the work Ian Romanick is doing on the texmem
branches is to make mixed 2D/3D GUIs practical and efficient.

These are my thoughts on the practical side of this. If people want to
discuss bigger visions, go ahead ;)

> To let them live in the 3D world and give them some benefits of 3D
> already, the X server could render the flat output they produce not
> into flat framebuffer memory, but into textures which are displayed
> on objects in the 3D world and can already be transparent, for
> example, to give some nice views.
>
> The real 3D windows would of course not need to be flat; they could
> look externally as today's Mesa applications do, but new 3D windows
> could of course be 3D objects then.
>
> Where they appear and how you get them, destroy them, size and move
> them would be handled by the 3D window manager then.
>
> Bernd
>
> PS: For initial 3D windows I think of 3D clocks, which you can size,
> rotate and view from a different angle with different lighting and
> fog.
>
> Also minimized icons or desktop icons could be small 3D objects
> hanging around behind, at the bottom of the screen or wherever you
> want them. Far or near, bright or dark.

--
Felix Kühling <[EMAIL PROTECTED]>
"You can do anything, just not everything at the same time!"
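To make the "2D primitives via the 3D engine" idea concrete: a solid X fill-rectangle becomes two triangles whose pixel coordinates are mapped into the clip space the 3D engine rasterizes in. A minimal sketch of just that coordinate mapping (the struct and function names are illustrative, not from any driver):

```c
/* Map window pixel coordinates to normalized device coordinates
 * (NDC, [-1,1] with y pointing up) -- the orthographic transform a
 * 2D fill-rect goes through before the 3D engine rasterizes it as
 * two triangles. */
typedef struct { float x, y; } vec2;

static vec2 pixel_to_ndc(float px, float py, float width, float height)
{
    vec2 v;
    v.x = 2.0f * px / width - 1.0f;  /* left edge -> -1, right edge -> +1 */
    v.y = 1.0f - 2.0f * py / height; /* top edge -> +1, bottom edge -> -1 */
    return v;
}
```

A WxH rectangle at (x, y) then needs only four such vertices, split into two triangles; no 2D engine, and thus no pipeline flush, is involved.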
Re: [Dri-devel] Using DRI to implement 2D X drivers
On Thu, Feb 27, 2003 at 02:01:22PM -0800, Jon Smirl wrote:
> --- Sven Luther <[EMAIL PROTECTED]> wrote:
> > Notice that the DRI drivers don't do anything like mode setting and
> > such, they depend on the X drivers for that. So if you take away
> > the X driver, you will not be able to get anything output on your
> > monitor. Unless you use the fbdev drivers for example.
>
> It would be simple to lift the mode setting and hardware
> identification code out of the fb drivers and add it to the DRM
> kernel driver. If you were still using the 2D drivers the new code in
> DRM would just be ignored.

Sure, there was a proposal to merge the fbdev and DRM drivers, but the
DRI people did not like it, one of the reasons being that fbdev is
Linux-specific while the DRM builds for more than one OS, if I remember
correctly. There is also DirectFB, which sits on top of fbdev and has
an X server running on top of it; not DRI enabled, I think, though.

> > Did you investigate the Berlin project, which, if I am not wrong,
> > uses OpenGL for drawing, and aims to be an X replacement? Not that
> > it is anywhere near that point though.
>
> I'm not really looking for an X alternative. I was just thinking
> about how to improve X over the next five to ten years. Both the Mac
> and Windows Longhorn are using new 3D-enabled GUIs. This is more of
> a response to these new GUIs.

I think you should join the discussion about XFree86 5.0 when it
happens.

> The goal would be to slowly transform the guts of X into something
> designed for 3D hardware instead of what we have now. This would be
> done such that no existing X apps would notice the changes. Moore's
> law means that everyone is going to have super 3D hardware in a
> couple of years.

Even embedded or handheld systems? And anyway, the way you do 2D and 3D
in hardware is somewhat different, and most hardware has special stuff
for 2D or some such.
> Without starting to think about 3D now, what will Linux's response to
> Longhorn be when it ships in a year or two?

Also, before you speak about unifying the 2D and 3D drivers, you need
to look at how a 3D desktop would work.

Friendly,

Sven Luther