Abstract

Owing to its geographical position, Indonesia is a center of convection and a driving force of global atmospheric circulation. Bukit Kototabang is a nationally strategic site for equatorial atmospheric observation, yet cloud cover data there remain limited, so suitable instruments must be developed to meet this need. A sky camera for cloud observation (Cloud Camera) is urgently needed to complement cloud cover data and to support atmospheric observation and research. The Cloud Camera was designed by modifying a CCD camera with several supporting devices, including a fisheye lens, a solar tracker, a sun filter, and a dome. This paper evaluates the urgency of these enhancements. Among the four combinations of supporting instruments (dome and sun filter) tested on the Cloud Camera, the best images were obtained by the configuration that uses a sun filter without a dome.
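The comparison of the four dome/sun-filter configurations can be sketched with a simple, hypothetical glare metric that is not taken from the paper: the fraction of near-saturated pixels in an 8-bit sky image. Direct-sun glare saturates pixels and washes out cloud detail, so a lower fraction suggests a more usable image. The `make_frame` helper and the example saturation levels below are illustrative assumptions, not measured values.

```python
# Illustrative sketch (assumed metric, not the authors' method):
# rank the four dome / sun-filter configurations by the fraction
# of near-saturated pixels in an 8-bit grayscale sky image.
import numpy as np

def saturated_fraction(image: np.ndarray, threshold: int = 250) -> float:
    """Fraction of pixels at or above `threshold` (near-saturated)."""
    return float(np.mean(image >= threshold))

def make_frame(sat_frac: float, size: int = 100) -> np.ndarray:
    """Synthetic stand-in frame with a chosen saturated-pixel fraction."""
    img = np.full((size, size), 100, dtype=np.uint8)  # mid-gray sky
    img.flat[: int(sat_frac * size * size)] = 255     # glare pixels
    return img

# Hypothetical saturation levels for the four configurations.
configs = {
    "sun filter, no dome": make_frame(0.01),
    "sun filter + dome":   make_frame(0.05),
    "no filter, no dome":  make_frame(0.20),
    "no filter + dome":    make_frame(0.30),
}

best = min(configs, key=lambda name: saturated_fraction(configs[name]))
print(best)  # the configuration with the least glare
```

Under these assumed inputs the ranking reproduces the paper's qualitative finding: the sun-filter-without-dome configuration yields the least saturated, and hence most usable, image.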

Keywords

Sky camera, fish eye, dome, sun filter

Article Details

How to Cite
Syafrijon S, Fahmi Rahmatia, Ridho Pratama, Teguh Nugraha Pratama, Ednofri, Muzirwan, Ismail AMM binti I. Design of a Sky Camera-Based Cloud Monitoring Camera at the Agam Space and Atmospheric Observation Station, Bukit Kototabang. EKSAKTA [Internet]. 2023 Sep 30 [cited 2024 Nov 23];23(03):362-7. Available from: https://eksakta.ppj.unp.ac.id/index.php/eksakta/article/view/426
