Search results

Number of results: 9

Abstract

A liquid crystal display (LCD) recycling process needs to recover resources beyond metals and plastics in order to increase its efficiency. This study investigates the pre-treatment process for recycling LCD glass. Recycling pre-treatment includes dismantling the LCD from the waste product, crushing the glass, and separating the glass particles from impurities. Scanning electron microscopy confirmed that oscillation milling is more effective than cut milling at maintaining a uniform powder shape and size. The glass particles crushed by the oscillating mill, operated at an optimized speed of 1500 rpm, had a uniformly distributed particle size of less than 10 µm. These small particles were separated from the organic impurities, yielding a 98% pure powder that can be used as a recycled raw material. The proposed pre-treatment process for recycling LCD glass will enhance the ability to use waste glass as a valuable resource in the manufacture of future displays.

Go to article

Authors and Affiliations

Seyul Kim
Yubin Kang
Leeseung Kang
Hyun Seon Hong
Chan Gi Lee

Abstract

Eye tracking systems are mostly video-based and require significant computation to achieve good accuracy. An alternative method with comparable accuracy but a lower computational cost is 2D microelectromechanical system (MEMS) mirror scanning. However, this technology is relatively new and publications on it are scarce. The purpose of this study was to examine how individual parameters of the system components affect the accuracy of pupil position estimation. The study was conducted using a virtual simulator. It was shown that the optimal detector field of view (FOV) depends on the frequency ratio of the MEMS mirror axes. For a ratio of 1:13, the smallest errors occurred at 0.°, 1.65°, 2.3°, and 2.95°. The error related to the signal sampling rate stabilizes at 0.065° above 3 kHz and no longer changes regardless of further increases in the number of samples. The error related to the frequency ratio of the MEMS mirror axes increases linearly in the range of 0.065°–0.1° up to a ratio of 1:230; above this, it jumps to an average value of 0.3°. The conducted research provides guidance for selecting parameters in the construction of MEMS mirror-based eye tracking systems.
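As a rough illustration of the frequency-ratio parameter examined in this study, the minimal Python sketch below (not the authors' simulator) generates the Lissajous trajectory traced by a resonant 2D MEMS mirror whose two axes oscillate sinusoidally. The base frequency, scan duration, and 3 kHz sampling rate are assumed values chosen only for demonstration.

```python
import numpy as np

def lissajous_scan(freq_ratio=13, base_freq_hz=100.0, sample_rate_hz=3000.0, duration_s=0.1):
    """Trajectory of a resonant 2D MEMS mirror whose axes oscillate sinusoidally.

    freq_ratio     -- ratio between fast- and slow-axis frequencies (e.g. 13 for 1:13)
    base_freq_hz   -- slow-axis frequency (assumed value, for illustration only)
    sample_rate_hz -- rate at which the detector signal is sampled
    """
    t = np.arange(0.0, duration_s, 1.0 / sample_rate_hz)
    # Normalised deflection angles of the two mirror axes form a Lissajous figure;
    # denser figures (higher ratios, longer scans) cover the pupil region more completely.
    x = np.sin(2.0 * np.pi * base_freq_hz * t)
    y = np.sin(2.0 * np.pi * base_freq_hz * freq_ratio * t)
    return t, x, y

t, x, y = lissajous_scan(freq_ratio=13)
print(f"{len(t)} samples, x in [{x.min():.2f}, {x.max():.2f}]")
```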
Go to article


Authors and Affiliations

Mateusz Pomianek 1
Marek Piszczek 1
Marcin Maciejewski 1

  1. Military University of Technology, Institute of Optoelectronics, 2 Kaliskiego St., 00-908 Warsaw, Poland

Abstract

The paper offers a reappraisal of the Puławy collection’s display through a detailed analysis of Virgilian evocations within the complex. The choice of inscriptions and ancient imagery framing the exposition’s narrative, as well as the surviving testimonies of the reception of such strategies within the Pulavian pavilions, demonstrates an ongoing questioning of chronological sequence, of the primacy of authenticity, and of the aestheticising of exhibits. Such anachronic distancing from a historicising temporality would take place in favour of an intimate experience of familial-cum-national memorabilia, in accordance with the contemporaneously emerging category of the fetish.
Go to article

Authors and Affiliations

Aleksander Musiał 1

  1. Princeton University

Abstract

This paper presents the relationship between Auditory Display (AD) and the domains of music and acoustics. First, some basic notions of the Auditory Display area are briefly outlined. Then, the research trends and system solutions within music technology, music information retrieval, music recommendation, and acoustics that fall within the scope of AD are discussed. Finally, an example of an AD solution based on gaze tracking that may facilitate the music annotation process is shown. The paper concludes with a few remarks about directions for further research in the domains discussed.

Go to article

Authors and Affiliations

Bożena Kostek

Abstract

Sonification is defined as the presentation of information by means of non-speech audio. In assistive technologies for the blind, sonification is most often used in electronic travel aids (ETAs): devices that support independent mobility through obstacle detection or help with orientation and navigation. The presented review contains an authored classification of the sonification schemes implemented in the most widely known ETAs, both commercially available and at various stages of research, according to the input used, the level of the signal processing algorithms applied, and the sonification methods. Additionally, a sonification approach developed in the Naviton project is presented. The prototype utilizes stereovision scene reconstruction, obstacle and surface segmentation, and spatial HRTF-filtered audio with discrete musical sounds. It was successfully tested in a pilot study with blind volunteers in a controlled environment, who were able to localize and navigate around obstacles.
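To make the notion of a sonification scheme concrete, the hypothetical Python sketch below maps a single detected obstacle to a short stereo tone, encoding distance as pitch and azimuth as simple stereo panning. It illustrates the general approach only; it is not the Naviton scheme, the 200–1000 Hz pitch range is an arbitrary assumption, and a full system would use HRTF filtering rather than constant-power panning.

```python
import numpy as np

def sonify_obstacle(distance_m, azimuth_deg, fs=44100, dur=0.3):
    """Map one detected obstacle to a short stereo tone.

    distance_m  -- distance to the obstacle; closer obstacles get a higher pitch
    azimuth_deg -- horizontal direction (-90 left ... +90 right), mapped to panning
    """
    # Closer obstacle -> higher pitch (hypothetical 200-1000 Hz range, clipped at 5 m).
    freq = 1000.0 - 800.0 * min(distance_m, 5.0) / 5.0
    t = np.linspace(0.0, dur, int(fs * dur), endpoint=False)
    tone = np.sin(2.0 * np.pi * freq * t) * np.hanning(t.size)
    # Constant-power panning stands in for full HRTF spatialisation.
    pan = (azimuth_deg + 90.0) / 180.0
    left = tone * np.cos(pan * np.pi / 2.0)
    right = tone * np.sin(pan * np.pi / 2.0)
    return np.stack([left, right], axis=1)

stereo = sonify_obstacle(distance_m=1.5, azimuth_deg=-30.0)
print(stereo.shape)  # (13230, 2) samples, ready to be written to an audio device
```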
Go to article

Authors and Affiliations

Michał Bujacz
Paweł Strumiłło

Abstract

The display of affection in romantic relationships and its concomitants still require more scientific attention. Despite some studies addressing the topic of affection display, the literature does not provide a psychometrically reliable self-descriptive tool to measure this construct. We therefore conducted three studies among Polish adults to develop and validate a psychological tool for comprehensively identifying and measuring the display of emotional affection. Study 1 (N = 894) aimed to develop and validate the Public and Private Romantic Display of Affection Scale (PPRDAS). It proved to be a valid psychological scale, as the theoretically assumed structure was supported by the results of the empirical analysis. Study 2 (N = 343) confirmed the convergent validity of the PPRDAS using the emotional expression items from the Dyadic Adjustment Scale (Spanier, 1989). In Study 3 (N = 204 couples), we further verified the external validity of the PPRDAS using an assessment of the affection displayed by one's partner in the relationship. Individuals’ self-estimates of their private and public displays of affection were confirmed by their romantic partners. In all studies, the display of feelings was positively correlated with sexual and relationship satisfaction. Negative correlations with age and the duration of the romantic relationship were also observed.
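For readers unfamiliar with the convergent-validity step, the short Python sketch below shows the underlying computation on simulated data: a Pearson correlation between total scores on two instruments that measure related constructs. The score distributions and effect size are invented for illustration and do not reflect the study's results.

```python
import numpy as np

def convergent_validity_r(scale_a, scale_b):
    """Pearson correlation between total scores on two instruments; a sizeable
    positive r supports convergent validity."""
    return float(np.corrcoef(scale_a, scale_b)[0, 1])

# Simulated data only (N = 343, as in Study 2); not the study's actual scores.
rng = np.random.default_rng(0)
pprdas_total = rng.normal(50, 10, size=343)                    # hypothetical PPRDAS totals
das_affect = 0.6 * pprdas_total + rng.normal(0, 8, size=343)   # correlated criterion scores

print(f"r = {convergent_validity_r(pprdas_total, das_affect):.2f}")
```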
Go to article

Authors and Affiliations

Dagna Joanna Kocur 1
Monika Prusik 2
Karolina Konopka 3

  1. University of Silesia, Katowice, Poland
  2. The University of Warsaw, Warsaw, Poland
  3. The Maria Grzegorzewska University, Warsaw, Poland

Abstract

For the Gdansk Library, participation in the Night of Museums is an effective way of fulfilling the didactic and science-promoting goals defined in its mission. The interesting main themes, the presentation of the library’s rich collections, and the involvement of the organisers and the participating staff members all translated into the success of the nine editions of the event held between 2011 and 2019. Their main themes were, in turn: A night with Johannes Hevelius from the treasury of the Gdansk Library; Fashion for books – fashion in books; Horror in the library; The world of the library on one night; Taking a book to further than the horizon; The Library as a Garden; Gedanum domus nostra – Our home – Gdansk; In mari vita tua – In the sea is your life; and Genius – on the 500th anniversary of the death of Leonardo da Vinci. During the past Night of Museums, visitors to the Library were most impressed by the displays in the Reading Room of Historical Collections. Such presentations are sometimes the only opportunity to come into close contact with valuable manuscripts, old printed books, prints, and other special collections, and to hear competent staff members talk about them. Regular attractions of the Night of Museums included displays referring to the theme of the event in the exhibition room and a sale of library publications in the hall of the historical building at ul. Wałowa 15. The subsequent organisers of the project, also in cooperation with other cultural institutions from the Tri-City, enriched the programme each time with fascinating talks, thematic workshops, and even concerts. Across the editions of the Night of Museums, the Gdansk Library hosted an average of 670 visitors, which testifies to the value of this promotional tool for the library.
Go to article

Authors and Affiliations

Joanna Śliwa 1

  1. PAN Biblioteka Gdańska, Dział Nowej Książki

Abstract

In this paper we propose a method that overcomes the basic functional problems of holographic displays for naked-eye observation, namely images that are too small and visible only within narrow viewing angles. The solution combines the spatiotemporal multiplexing method with a 4f optical system. It makes it possible to increase the aperture of a holographic display and to extend the angular field of view. The applicability of the modified display is demonstrated by a Wigner distribution analysis of holographic imaging with the spatiotemporal multiplexing method and by experiments performed on the display demonstrator.
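The aperture and viewing-angle limits that motivate the multiplexing approach can be sketched with the standard grating-equation estimate below. The wavelength, pixel pitch, and SLM resolution are assumed values chosen for illustration and do not describe the demonstrator reported in the paper.

```python
import numpy as np

WAVELENGTH_M = 632.8e-9   # He-Ne red illumination (assumed)
PIXEL_PITCH_M = 8.0e-6    # SLM pixel pitch (assumed)
SLM_WIDTH_PX = 1920       # pixels across one SLM (assumed)

def viewing_angle_deg(wavelength_m=WAVELENGTH_M, pitch_m=PIXEL_PITCH_M):
    """Full angular field of view of a pixelated holographic display,
    limited by the grating equation to 2*asin(lambda / (2 p))."""
    return np.degrees(2.0 * np.arcsin(wavelength_m / (2.0 * pitch_m)))

def effective_aperture_mm(n_subapertures, pitch_m=PIXEL_PITCH_M, width_px=SLM_WIDTH_PX):
    """Aperture synthesized by tiling n_subapertures SLM images side by side,
    which is the basic idea behind spatiotemporal multiplexing."""
    return n_subapertures * width_px * pitch_m * 1e3

print(f"viewing angle:    {viewing_angle_deg():.2f} deg")
print(f"aperture, 1 SLM:  {effective_aperture_mm(1):.1f} mm")
print(f"aperture, 4 SLMs: {effective_aperture_mm(4):.1f} mm")
```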

Go to article

Authors and Affiliations

G. Finke
M. Kujawińska
T. Kozacki
W. Zaperty

Abstract

This paper presents the results of Pilot Assisting Module research performed on two light aircraft flight simulators developed in parallel at the Brno University of Technology, Czech Republic, and the Rzeszow University of Technology, Poland. The first simulator was designed as an open platform for the verification and validation of advanced pilot/aircraft interface systems and inherited its appearance from the cockpit section of the Evektor SportStar. The second flight simulator, the XM-15, was built around the cockpit of the unique agricultural jet Belfegor. It introduced a system architecture that supports scientific simulations of various aircraft types and configurations, making it suitable for conceptual testing of the Pilot Assisting Module. The XM-15 was initially designed to support research on advanced flight control systems, but through continuing modernization it evolved into a hardware-in-the-loop test bed for electromechanical actuators and CAN-based autopilot controller blocks. Pilot-in-the-loop experiments with the proposed Pilot Assisting Module revealed favorable operational scenarios under which the proposed system reduces the cockpit workload during single-pilot operations.
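As a loose illustration of what a CAN-based actuator interface in such a hardware-in-the-loop test bed might look like, the Python sketch below packs a commanded control-surface deflection into a CAN frame using the python-can library. The arbitration ID, payload format, and channel name are hypothetical and are not taken from the XM-15 test bed.

```python
import struct

import can  # python-can

# Hypothetical arbitration ID for an elevator actuator command frame;
# a real test bed would follow its own CAN message dictionary.
ELEVATOR_CMD_ID = 0x101

def send_elevator_command(bus, deflection_deg):
    """Pack a commanded control-surface deflection into a CAN frame and send it."""
    payload = struct.pack("<f", deflection_deg)  # 4-byte little-endian float
    msg = can.Message(arbitration_id=ELEVATOR_CMD_ID, data=payload, is_extended_id=False)
    bus.send(msg)

if __name__ == "__main__":
    # The 'virtual' interface lets the sketch run without HIL hardware attached.
    with can.Bus(interface="virtual", channel="hil_demo") as bus:
        send_elevator_command(bus, deflection_deg=-2.5)
```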

Go to article

Authors and Affiliations

Peter Chudy
Pawel Rzucidlo
