COMP9414 24T2
Artificial Intelligence
Assignment 1 - Artificial neural networks
Due: Week 5, Wednesday, 26 June 2024, 11:55 PM.
1 Problem context
Time Series Air Quality Prediction with Neural Networks: In this assignment, you will delve into the realm of time series prediction using neural network architectures. You will explore both classification and estimation tasks using a publicly available dataset.
You will be provided with a dataset named “Air Quality” [1], available on the UCI Machine Learning Repository (https://archive.ics.uci.edu/dataset/360/air+quality). We tailored this dataset for this assignment and made some modifications; therefore, please only use the attached dataset for this assignment.
The given dataset contains 8,358 instances of hourly averaged responses from an array of five metal oxide chemical sensors embedded in an air quality chemical multisensor device. The device was located in the field in a significantly polluted area at road level within an Italian city. Data were recorded from March 2004 to February 2005 (one year), representing the longest freely available recordings of responses from an on-field deployed air quality chemical sensor device. Ground truth hourly averaged concentrations for carbon monoxide, non-methane hydrocarbons, benzene, total nitrogen oxides, and nitrogen dioxide, among other variables, were provided by a co-located reference-certified analyser. The variables included in the dataset are listed in Table 1. Missing values within the dataset are tagged with the value -200.
Table 1: Variables within the dataset.

  Variable         Meaning
  CO(GT)           True hourly averaged concentration of carbon monoxide
  PT08.S1(CO)      Hourly averaged sensor response
  NMHC(GT)         True hourly averaged overall Non-Methane Hydrocarbons concentration
  C6H6(GT)         True hourly averaged Benzene concentration
  PT08.S2(NMHC)    Hourly averaged sensor response
  NOx(GT)          True hourly averaged NOx concentration
  PT08.S3(NOx)     Hourly averaged sensor response
  NO2(GT)          True hourly averaged NO2 concentration
  PT08.S4(NO2)     Hourly averaged sensor response
  PT08.S5(O3)      Hourly averaged sensor response
  T                Temperature
  RH               Relative Humidity
  AH               Absolute Humidity
2 Activities
This assignment focuses on two main objectives:
• Classification Task: You should develop a neural network that can predict whether the concentration of Carbon Monoxide (CO) exceeds a certain threshold – the mean of the CO(GT) values – based on historical air quality data. This is a binary classification task, where your model learns to classify instances into two categories: above or below the threshold. To determine the threshold, you must first calculate the mean value of CO(GT), excluding unknown data (missing values); a minimal sketch of this threshold computation is shown below. Then, use this threshold to decide whether the value predicted by your network is above or below it. You are free to choose and design your own network, and there are no limitations on its structure; however, your network should be capable of handling missing values.
• Regression Task: You should develop a neural network that can predict the concentration of Nitrogen Oxides (NOx) based on other air quality features. This task involves estimating a continuous numerical value (the NOx concentration) from the input features using regression techniques. You are free to choose and design your own network, and there is no limitation on its structure; however, your model should be able to deal with missing values.
In summary, the classification task aims to divide instances into two categories (exceeding or not exceeding the CO(GT) threshold), while the regression task aims to predict a continuous numerical value (the NOx concentration).
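As a concrete illustration of the threshold computation mentioned in the classification task, the following minimal sketch (in Python, assuming the attached dataset is available as a CSV file named air_quality.csv with the column names of Table 1) computes the mean of CO(GT) while excluding the -200 missing-value tag. It is a sketch under these assumptions, not the required solution.

import numpy as np
import pandas as pd

# Load the provided dataset (the file name is an assumption; use the attached file).
df = pd.read_csv("air_quality.csv")

# Treat the -200 tag as missing so it does not distort the mean.
co = df["CO(GT)"].replace(-200, np.nan)
threshold = co.mean(skipna=True)
print(f"CO(GT) threshold (mean excluding missing values): {threshold:.3f}")

# Binary target: 1 if CO(GT) exceeds the threshold, 0 otherwise.
# Rows where CO(GT) itself is missing are kept as NaN here; how you handle them
# (dropping, imputing, ...) is part of your own design.
df["CO_above_mean"] = np.where(co.isna(), np.nan, (co > threshold).astype(float))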
2.1 Data preprocessing
It is expected that you analyse the provided data and perform any required preprocessing. Some of the tasks during preprocessing might include the ones shown below; however, not all of them are necessary and you should evaluate each of them against the results obtained. A possible preprocessing sketch follows the list.
(a) Identify variation range for input and output variables.
(b) Plot each variable to observe the overall behaviour of the process.
(c) In case outliers or missing data are detected, correct the data accordingly.
(d) Split the data for training and testing.
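The sketch below illustrates one way of covering steps (a)-(d). It assumes the DataFrame df from the previous snippet; none of these steps is mandatory, and each should be evaluated against the results obtained.

import numpy as np
import matplotlib.pyplot as plt
from sklearn.model_selection import train_test_split

# Tag missing values explicitly so they can be detected and corrected.
df = df.replace(-200, np.nan)

# (a) Variation range of the input and output variables.
print(df.describe().loc[["min", "max"]])

# (b) Plot each numeric variable to observe the overall behaviour of the process.
for col in df.select_dtypes("number").columns:
    df[col].plot(title=col)
    plt.show()

# (c) One possible correction for missing data: forward-fill imputation.
df = df.ffill()

# (d) Split the data for training and testing (no shuffling, since this is a time series).
train_df, test_df = train_test_split(df, test_size=0.2, shuffle=False)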
2.2 Design of the neural network
You should select and design neural architectures for addressing both the classification and regression problems described above. In each case, consider the following steps:
(a) Design the network and decide on the number of layers, units, and their respective activation functions.
(b) Remember that it is recommended to keep the number of network parameters Nw below (number of samples)/10.
(c) Create the neural network using Keras and TensorFlow; a minimal sketch is shown below.
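The following minimal Keras sketch illustrates step (c). The layer sizes and activations are placeholders rather than a recommended architecture, and n_features is an assumption standing in for the number of input columns you select; the last line checks the rule of thumb from step (b).

from tensorflow import keras

n_features = 10       # assumption: set this to the number of input features you use
n_samples = 8358      # number of instances in the provided dataset

model = keras.Sequential([
    keras.layers.Input(shape=(n_features,)),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(16, activation="relu"),
    # Sigmoid output for the classification task; use a linear unit for regression.
    keras.layers.Dense(1, activation="sigmoid"),
])

# Rule of thumb from step (b): Nw < (number of samples) / 10.
print("Nw =", model.count_params(), " limit =", n_samples // 10)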
2.3 Training
In this section, you have to train your proposed neural network. Consider
the following steps:
(a) Decide on the training parameters, such as the loss function, optimizer, batch size, learning rate, and number of epochs.
(b) Train the neural model and monitor the loss values during the process.
(c) Verify possible overfitting problems; a minimal training sketch follows this list.
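The training sketch below illustrates steps (a)-(c). It assumes the model from the previous sketch and training arrays X_train and y_train produced by your own preprocessing; the loss, optimizer, batch size and number of epochs are illustrative values only.

from tensorflow import keras

# (a) Training parameters: loss, optimizer, learning rate, batch size, epochs.
model.compile(optimizer=keras.optimizers.Adam(learning_rate=1e-3),
              loss="binary_crossentropy",   # e.g. "mse" for the regression task
              metrics=["accuracy"])

# (b) Train the model and keep the loss values recorded during the process.
history = model.fit(X_train, y_train,
                    validation_split=0.2,
                    batch_size=32,
                    epochs=100,
                    verbose=0)

# (c) A quick overfitting check: a validation loss that rises while the
# training loss keeps falling is a typical symptom.
print("final training loss:  ", history.history["loss"][-1])
print("final validation loss:", history.history["val_loss"][-1])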
2.4 Validating the neural model
Assess your results by plotting the training results and the network response for the test inputs against the test targets. Compute error indexes to complement the visual analysis.
(a) For the classification task, draw two different plots to illustrate your results over different epochs. In the first plot, show the training and validation loss over the epochs. In the second plot, show the training and validation accuracy over the epochs. For example, Figure 1 and Figure 2 show loss and classification accuracy plots for 100 epochs, respectively; a minimal plotting sketch is shown after the figures.
Figure 1: Loss plot for the classification task.
Figure 2: Accuracy plot for the classification task.
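A minimal sketch for the two classification plots in (a) is shown below; it assumes the history object returned by model.fit(...) in the training sketch above.

import matplotlib.pyplot as plt

# Plot 1: training and validation loss over the epochs.
plt.figure()
plt.plot(history.history["loss"], label="training loss")
plt.plot(history.history["val_loss"], label="validation loss")
plt.xlabel("epoch"); plt.ylabel("loss")
plt.legend()
plt.show()

# Plot 2: training and validation accuracy over the epochs.
plt.figure()
plt.plot(history.history["accuracy"], label="training accuracy")
plt.plot(history.history["val_accuracy"], label="validation accuracy")
plt.xlabel("epoch"); plt.ylabel("accuracy")
plt.legend()
plt.show()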
(b) For the classification task, compute a confusion matrix (https://en.wikipedia.org/wiki/Confusion_matrix) including True Positive (TP), True Negative (TN), False Positive (FP), and False Negative (FN), as shown in Table 2. Moreover, report accuracy and precision for your test data and mention the number of tested samples, as shown in Table 3 (the numbers shown in both tables are randomly chosen and may not be consistent with each other). For instance, the scikit-learn library offers a wide range of metric functions (https://scikit-learn.org/stable/api/sklearn.metrics.html), including the confusion matrix (https://scikit-learn.org/stable/modules/generated/sklearn.metrics.confusion_matrix.html), accuracy, and precision. You can use the scikit-learn built-in metric functions to calculate the mentioned metrics or develop your own functions; a minimal sketch follows Table 3.
Table 2: Confusion matrix for the test data for the classification task.

                          Positive (Actual)   Negative (Actual)
  Positive (Predicted)    103                 6
  Negative (Predicted)    6                   75

Table 3: Accuracy and precision for the test data for the classification task.

                           Accuracy   Precision   Number of Samples
  CO(GT) classification    63%        60%         190
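The sketch below is one possible way of computing the confusion matrix, accuracy and precision with scikit-learn's built-in metric functions; y_test and X_test are assumed to come from your own train/test split, and model is the trained classification network from the earlier sketches.

from sklearn.metrics import accuracy_score, confusion_matrix, precision_score

# Threshold the sigmoid outputs of the network to obtain class labels.
y_prob = model.predict(X_test).ravel()
y_pred = (y_prob >= 0.5).astype(int)

# Confusion matrix entries for the binary case: TN, FP, FN, TP.
tn, fp, fn, tp = confusion_matrix(y_test, y_pred).ravel()
print(f"TP={tp}  TN={tn}  FP={fp}  FN={fn}")

print(f"Accuracy:  {accuracy_score(y_test, y_pred):.3f}")
print(f"Precision: {precision_score(y_test, y_pred):.3f}")
print(f"Number of tested samples: {len(y_test)}")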
(c) For the regression task, draw two different plots to illustrate your results. In the first plot, show how the selected loss function varies for both the training and validation sets through the epochs. In the second plot, show the final estimation results for the validation set. For instance, Figure 3 and Figure 4 show the loss function and the network outputs versus the actual NOx(GT) values for a validation set, respectively. In Figure 4, no data preprocessing has been performed; however, as mentioned above, you are expected to include preprocessing in your assignment. A minimal sketch of the estimation plot is shown after the figures.

Figure 3: Loss plot for the regression task.
Figure 4: Estimated and actual NOx(GT) for the validation set.
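A minimal sketch of the estimation plot is given below; reg_model is assumed to be your trained regression network, and X_test / y_test the corresponding test inputs and NOx(GT) targets from your own pipeline.

import numpy as np
import matplotlib.pyplot as plt

y_hat = reg_model.predict(X_test).ravel()

plt.figure()
plt.plot(np.asarray(y_test), label="actual NOx(GT)")
plt.plot(y_hat, label="estimated NOx(GT)")
plt.xlabel("test sample")
plt.ylabel("NOx(GT)")
plt.legend()
plt.show()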
(d) For the regression task, report performance indexes in a table, including the Root Mean Squared Error (RMSE), the Mean Absolute Error (MAE) (see a discussion in [2]), and the number of samples used for your estimation of the NOx(GT) values.
Root Mean Squared Error (RMSE) measures the differences between the observed values and the predicted ones and is defined as follows:

RMSE = \sqrt{\frac{1}{n} \sum_{i=1}^{n} (Y_i - \hat{Y}_i)^2},    (1)

where n is the number of samples, Y_i is the actual label and \hat{Y}_i is the predicted value. In the same way, MAE can be defined as the absolute average of the errors as follows:

MAE = \frac{1}{n} \sum_{i=1}^{n} |Y_i - \hat{Y}_i|.    (2)
Table 4 shows an example of the performance indexes (all numbers are randomly chosen and may not be consistent with each other). As mentioned before, the scikit-learn library offers a wide range of metric functions, including RMSE (https://scikit-learn.org/stable/modules/generated/sklearn.metrics.root_mean_squared_error.html) and MAE (https://scikit-learn.org/stable/modules/generated/sklearn.metrics.mean_absolute_error.html). You can use the scikit-learn built-in metric functions to calculate the mentioned metrics or develop your own functions; a minimal sketch is shown after Table 4.
Table 4: Result table for the test data for the regression task.
RMSE MAE Number of Samples
90.60 50.35 55
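A minimal sketch of the RMSE and MAE computation with scikit-learn is given below (y_test and y_hat as in the previous regression sketch). Note that root_mean_squared_error is only available in recent scikit-learn versions; with older versions, mean_squared_error(y_test, y_hat, squared=False) gives the same value.

from sklearn.metrics import mean_absolute_error, root_mean_squared_error

rmse = root_mean_squared_error(y_test, y_hat)
mae = mean_absolute_error(y_test, y_hat)
print(f"RMSE = {rmse:.2f}   MAE = {mae:.2f}   Number of samples = {len(y_test)}")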
3 Testing and discussing your code
As part of the assignment evaluation, your code will be tested by the tutors together with you in a discussion session carried out during the Week 6 tutorial. The assignment has a total of 25 marks. The discussion is mandatory and, therefore, we will not mark any assignment that is not discussed with the tutors.
You are expected to propose and build neural models for the classification and regression tasks. The minimal output we expect to see is the set of results described in Section 2.4. You will receive marks for each of these subsections as shown in Table 5, i.e. 7 marks in total. However, it is fine if you want to include any other outcome to highlight particular aspects when testing and discussing your code with your tutor.
For marking your results, you should be prepared to simulate your neural model with a generalisation set we have kept aside for that purpose. You must anticipate this by including in your submission a script ready to open a file (with the same characteristics as the given dataset but with fewer data points), simulate the network, and perform all the validation tests described in Section 2.4 (b) and (d) (accuracy, precision, RMSE, MAE). It is recommended to save all of your hyper-parameters and weights (your model in general) so you can reload your network and perform the analysis later in your discussion session; a minimal save-and-reload sketch is shown below.
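The sketch below illustrates one way of saving and reloading a model for the evaluation script. The file names and the preprocess helper are assumptions to be replaced by your own code; the point is that the submitted notebook can open a new file with the same characteristics as the given dataset and reproduce the Section 2.4 (b) and (d) metrics without retraining.

import pandas as pd
from tensorflow import keras

# After training: save hyper-parameters and weights (the whole model).
model.save("co_classifier.keras")

# In the evaluation script: reload the model and apply it to the generalisation file.
model = keras.models.load_model("co_classifier.keras")
gen_df = pd.read_csv("generalisation_set.csv")     # same columns as the given dataset
X_gen, y_gen = preprocess(gen_df)                  # your own preprocessing function (assumed)

y_pred = (model.predict(X_gen).ravel() >= 0.5).astype(int)
# ...then compute accuracy/precision (classification) or RMSE/MAE (regression)
# exactly as in the earlier sketches.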
For the classification task you need to compute accuracy and precision, while for the regression task you need to compute RMSE and MAE, using the generalisation set. You will receive 3 marks for each task, given successful results. Expected results are as follows:
• For the classification task, your network should achieve at least 85% accuracy and precision. Accuracy and precision lower than that will result in a score of 0 marks for that specific section.
• For the regression task, you are expected to achieve an RMSE of at most 280 and an MAE of at most 220 for unseen data points. Errors higher than the mentioned values will be marked as 0 marks.
Finally, you will receive 1 mark for code readability for each task, and
your tutor will also give you a maximum of 5 marks for each task depending
on the level of code understanding as follows: 5. Outstanding, 4. Great,
3. Fair, 2. Low, 1. Deficient, 0. No answer.
Table 5: Marks for each task.

  Task                                                                     Marks
  Results obtained with the given dataset
    Loss and accuracy plots for classification task                        2 marks
    Confusion matrix, accuracy and precision tables for classification task  2 marks
    Loss and estimated NOx(GT) plots for regression task                   2 marks
    Performance indexes table for regression task                          1 mark
  Results obtained with the generalisation dataset
    Accuracy and precision for classification task                         3 marks
    RMSE and MAE for regression task                                       3 marks
  Code understanding and discussion
    Code readability for classification task                               1 mark
    Code readability for regression task                                   1 mark
    Code understanding and discussion for classification task              5 marks
    Code understanding and discussion for regression task                  5 marks
  Total marks                                                              25 marks
4 Submitting your assignment
The assignment must be done individually. You must submit your assignment solution via Moodle. This will consist of a single .ipynb Jupyter file. This file
should contain all the necessary code for reading files, data preprocessing,
network architecture, and result evaluations. Additionally, your file should
include short text descriptions to help markers better understand your code.
Please be mindful that providing clean and easy-to-read code is a part of
your assignment.
Please indicate your full name and your zID at the top of the file as a
comment. You can submit as many times as you like before the deadline –
later submissions overwrite earlier ones. After submitting your file, it is good practice to take a screenshot of it for future reference.
Late submission penalty: UNSW has a standard late submission penalty of 5% per day from your mark, capped at five days from the assessment deadline; after that, students cannot submit the assignment.
5 Deadline and questions
Deadline: Week 5, Wednesday, 26 June 2024, 11:55 PM. Please use the forum on Moodle to ask questions related to the project. We will prioritise questions asked in the forum. However, you should not share your code there, to avoid making it public and risking plagiarism. In that case, use the course email cs9414@cse.unsw.edu.au as an alternative.
Although we try to answer questions as quickly as possible, we might take up to 1 or 2 business days to reply; therefore, last-minute questions might not be answered in time.
6 Plagiarism policy
Your program must be entirely your own work. Plagiarism detection software
might be used to compare submissions pairwise (including submissions for
any similar projects from previous years) and serious penalties will be applied,
particularly in the case of repeat offences.
Do not copy from others. Do not allow anyone to see your code.
Please refer to the UNSW Policy on Academic Honesty and Plagiarism if you
require further clarification on this matter.
References
[1] De Vito, S., Massera, E., Piga, M., Martinotto, L. and Di Francia, G., 2008. On field calibration of an electronic nose for benzene estimation in an urban pollution monitoring scenario. Sensors and Actuators B: Chemical, 129(2), pp. 750-757.
[2] Hodson, T. O., 2022. Root mean square error (RMSE) or mean absolute error (MAE): When to use them or not. Geoscientific Model Development Discussions, 2022, pp. 1-10.

請(qǐng)加QQ:99515681  郵箱:99515681@qq.com   WX:codinghelp













 

標(biāo)簽:

掃一掃在手機(jī)打開(kāi)當(dāng)前頁(yè)
  • 上一篇:代寫(xiě)指標(biāo)編寫(xiě) 編寫(xiě)同花順指標(biāo)公式 代編公式
  • 下一篇:ECON2101代做、代寫(xiě)Python/c++設(shè)計(jì)編程
  • CMT219代寫(xiě)、代做Java程序語(yǔ)言
  • 代做MATH1033、代寫(xiě)c/c++,Java程序語(yǔ)言
  • 代做CSCI 2525、c/c++,Java程序語(yǔ)言代寫(xiě)
  • COMP 315代寫(xiě)、Java程序語(yǔ)言代做
  • 昆明生活資訊

    昆明圖文信息
    蝴蝶泉(4A)-大理旅游
    蝴蝶泉(4A)-大理旅游
    油炸竹蟲(chóng)
    油炸竹蟲(chóng)
    酸筍煮魚(yú)(雞)
    酸筍煮魚(yú)(雞)
    竹筒飯
    竹筒飯
    香茅草烤魚(yú)
    香茅草烤魚(yú)
    檸檬烤魚(yú)
    檸檬烤魚(yú)
    昆明西山國(guó)家級(jí)風(fēng)景名勝區(qū)
    昆明西山國(guó)家級(jí)風(fēng)景名勝區(qū)
    昆明旅游索道攻略
    昆明旅游索道攻略
  • NBA直播 短信驗(yàn)證碼平臺(tái) 幣安官網(wǎng)下載 歐冠直播 WPS下載

    關(guān)于我們 | 打賞支持 | 廣告服務(wù) | 聯(lián)系我們 | 網(wǎng)站地圖 | 免責(zé)聲明 | 幫助中心 | 友情鏈接 |

    Copyright © 2025 kmw.cc Inc. All Rights Reserved. 昆明網(wǎng) 版權(quán)所有
    ICP備06013414號(hào)-3 公安備 42010502001045

    狠狠综合久久久久综合网址-a毛片网站-欧美啊v在线观看-中文字幕久久熟女人妻av免费-无码av一区二区三区不卡-亚洲综合av色婷婷五月蜜臀-夜夜操天天摸-a级在线免费观看-三上悠亚91-国产丰满乱子伦无码专区-视频一区中文字幕-黑人大战欲求不满人妻-精品亚洲国产成人蜜臀av-男人你懂得-97超碰人人爽-五月丁香六月综合缴情在线
  • <dl id="akume"></dl>
  • <noscript id="akume"><object id="akume"></object></noscript>
  • <nav id="akume"><dl id="akume"></dl></nav>
  • <rt id="akume"></rt>
    <dl id="akume"><acronym id="akume"></acronym></dl><dl id="akume"><xmp id="akume"></xmp></dl>
    久无码久无码av无码| 熟女少妇在线视频播放| 日av中文字幕| www黄色日本| 国产成人艳妇aa视频在线| 色一情一区二区三区| 欧美在线观看视频网站| 四虎永久在线精品无码视频| 日韩在线视频在线观看| 国产1区2区在线| 日本不卡在线观看视频| 亚洲精品无码久久久久久| 91猫先生在线| 日韩欧美黄色大片| 8x8x成人免费视频| 99999精品| 佐佐木明希av| 黄页免费在线观看视频| 国产aaa一级片| 在线观看免费视频高清游戏推荐| 视色视频在线观看| 99精品一级欧美片免费播放| 成人国产一区二区三区| 日韩精品一区在线视频| 能在线观看的av网站| 国产日韩欧美久久| 中文字幕日韩精品无码内射| av免费观看大全| 手机在线看福利| 国产av第一区| 国产精品69页| 亚洲成人动漫在线| 免费黄色日本网站| 中文字幕第66页| 欧美黑人经典片免费观看| 男女爽爽爽视频| 欧美性猛交内射兽交老熟妇| 国产超级av在线| 日本福利视频导航| 久章草在线视频| 日韩在线视频在线| 亚洲免费黄色录像| 日本少妇高潮喷水视频| 久久av秘一区二区三区| 国产日韩成人内射视频 | 香蕉视频999| 欧美亚洲一二三区| 咪咪色在线视频| 国产成人精品视频ⅴa片软件竹菊| 91女神在线观看| 久久久久狠狠高潮亚洲精品| 国产精品啪啪啪视频| 国产理论在线播放| 久久综合色视频| www.在线观看av| 四虎影院一区二区| 99国产精品久久久久久| 538在线视频观看| 男女av免费观看| 国产资源在线视频| a级黄色小视频| 免费网站永久免费观看| 日日噜噜夜夜狠狠久久丁香五月| 老熟妇仑乱视频一区二区| 久久久999免费视频| 国产av熟女一区二区三区| www.偷拍.com| 中文字幕av久久| 做爰高潮hd色即是空| 国产资源中文字幕| 婷婷视频在线播放| 欧美一级免费在线观看| 99re99热| 国产一区 在线播放| 今天免费高清在线观看国语| 亚洲欧美日韩不卡| 国产一级大片免费看| 青草网在线观看| 无码人妻精品一区二区三区在线| 精品少妇人妻av免费久久洗澡| 鲁一鲁一鲁一鲁一色| 国产第一页视频| 国模私拍视频在线观看| 色黄视频免费看| 免费网站永久免费观看| 欧洲黄色一级视频| 日韩在线xxx| 91热视频在线观看| 天天做天天躁天天躁| 午夜精品久久久久久久无码| 狠狠爱免费视频| 日韩 国产 一区| 又大又硬又爽免费视频| 久久久久狠狠高潮亚洲精品| 手机版av在线| 国产亚洲黄色片| 波多野结衣天堂| 国内精品国产三级国产99| 男人靠女人免费视频网站| 嫩草视频免费在线观看| 久久亚洲国产成人精品无码区| 成人毛片一区二区| 57pao国产成永久免费视频| 国产视频在线观看网站| 青青青在线视频免费观看| 大地资源网在线观看免费官网| 日韩国产欧美亚洲| 永久免费黄色片| 岳毛多又紧做起爽| 成人小视频在线观看免费| 亚洲精品午夜在线观看| 国产69精品久久久久久久| 一级黄色片在线免费观看| 黄色免费视频大全| 亚洲精品少妇一区二区| 日本人69视频| 久久精品99国产| 水蜜桃色314在线观看| 色姑娘综合天天| 杨幂毛片午夜性生毛片 | 国产精品v日韩精品v在线观看| 加勒比成人在线| 黄色网zhan| 99精品视频网站| 久久久精品高清| 国产一级片自拍| www.xxx亚洲| 欧美黄色一级片视频| 国产精品又粗又长| 一本久道高清无码视频| 国产青草视频在线观看| 日韩精品一区二区三区电影| 在线免费看v片| 午夜影院免费观看视频| 三级性生活视频| 亚洲综合123| 亚洲AV无码成人精品一区| 两性午夜免费视频| 亚洲av无日韩毛片久久| 亚洲制服中文字幕| 天天综合中文字幕| 人人妻人人澡人人爽精品欧美一区 | 国产精品无码av无码| 欧美激情成人网| 天天爽人人爽夜夜爽| 无尽裸体动漫2d在线观看| 日本高清久久久| 激情成人在线观看| 日本福利视频网站| 浮妇高潮喷白浆视频| 88av.com| 一区二区三区国产好的精华液| 在线无限看免费粉色视频| 亚洲五码在线观看视频| 激情深爱综合网| 午夜精品在线免费观看| 成人黄色一级大片| 国产乱子伦精品视频| 免费看国产曰批40分钟| 亚洲老女人av| 热久久最新地址| 日本精品久久久久中文字幕| 视色视频在线观看| 免费人成自慰网站| 欧美一级特黄a| 国产精品无码电影在线观看| 激情综合网婷婷| 欧美 亚洲 视频| 9久久婷婷国产综合精品性色| 黑人巨大国产9丨视频| 欧美精品99久久| 好吊色这里只有精品| 久久九九国产视频| 视色,视色影院,视色影库,视色网| 成年人网站免费视频| 在线视频一二区| 538在线视频观看| 日韩精品一区二区在线视频| 国产九九热视频| 欧美日本视频在线观看| 青青草免费在线视频观看| 久久久国产欧美| 成年人观看网站| 久艹在线免费观看| 亚洲国产精品影视| 欧美在线aaa| 婷婷丁香激情网| 亚洲午夜精品久久久久久人妖| 一级一片免费播放| 网站在线你懂的| www.亚洲高清| 尤物国产在线观看| 久久久久久久激情| 青青草国产精品视频| www插插插无码免费视频网站| 国产女同无遮挡互慰高潮91| 九九热99视频| www.99av.com| 麻豆一区二区三区视频| 国产嫩草在线观看|