Wikimedia Commons API search images by (latitude, longitude) - geolocation

I would like to retrieve images from Wikimedia Commons to display on a map: given a pair (latitude, longitude), I want to find pictures taken around that point.
After a day of searching and trying I still have no idea whether this is possible. In particular, I have read the MediaWiki API main page, the API reference, and some examples.
So my question is: is it possible to retrieve pictures by a pair of geographical coordinates? If yes, how?

Yeah, that's possible. On Commons, Extension:GeoData is installed. Use action=query&list=geosearch&gscoord=lat|lon&gsradius=meters&gsnamespace=6&gsprimary=all
Excerpt from the API documentation:
gscoord - Coordinate around which to search: two floating-point values separated by pipe (|)
gspage - Title of page around which to search
gsradius - Search radius in meters
  This parameter is required
  The value must be between 10 and 10000
gsmaxdim - Restrict search to objects no larger than this, in meters
gslimit - Maximum number of pages to return
  No more than 500 (5000 for bots) allowed
  Default: 10
gsnamespace - Namespace(s) to search
  Values (separate with '|'): 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 100, 101, 102, 103, 104, 105, 106, 107, 460, 461, 490, 491, 1198, 1199, 828, 829
  Maximum number of values 50 (500 for bots)
  Default: 0
gsprop - What additional coordinate properties to return
  Values (separate with '|'): type, name, dim, country, region, globe
  Default: globe
File namespace is NS 6 in MediaWiki by default.
Example:
https://commons.wikimedia.org/w/api.php?format=jsonfm&action=query&list=geosearch&gsprimary=all&gsnamespace=6&gsradius=500&gscoord=51.5|11.95
Result:
{
    "query": {
        "geosearch": [
            {
                "pageid": 28971703,
                "ns": 6,
                "title": "File:RiveuferHerbst.JPG",
                "lat": 51.501042,
                "lon": 11.948794,
                "dist": 142.8
            },
            {
                "pageid": 32760810,
                "ns": 6,
                "title": "File:Pei\u00dfnitznordspitze4.JPG",
                "lat": 51.499675,
                "lon": 11.947992,
                "dist": 143.6
            }
        ]
    }
}
If you additionally want to obtain thumbnail URLs with your API request, use list=geosearch as a generator:
Example:
https://commons.wikimedia.org/w/api.php?format=jsonfm&action=query&generator=geosearch&ggsprimary=all&ggsnamespace=6&ggsradius=500&ggscoord=51.5|11.95&ggslimit=1&prop=imageinfo&iilimit=1&iiprop=url&iiurlwidth=200&iiurlheight=200
Result:
{
    "query": {
        "pages": {
            "28971703": {
                "pageid": 28971703,
                "ns": 6,
                "title": "File:RiveuferHerbst.JPG",
                "imagerepository": "local",
                "imageinfo": [
                    {
                        "thumburl": "https://upload.wikimedia.org/wikipedia/commons/thumb/b/b2/RiveuferHerbst.JPG/200px-RiveuferHerbst.JPG",
                        "thumbwidth": 200,
                        "thumbheight": 150,
                        "url": "https://upload.wikimedia.org/wikipedia/commons/b/b2/RiveuferHerbst.JPG",
                        "descriptionurl": "https://commons.wikimedia.org/wiki/File:RiveuferHerbst.JPG"
                    }
                ]
            }
        }
    }
}
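Put together in code, the first request might look like the following minimal Python sketch using only the standard library. The function names (geosearch_url, nearby_files) are mine; the endpoint and parameter names come from the API documentation quoted above:

```python
import json
import urllib.parse
import urllib.request

COMMONS_API = "https://commons.wikimedia.org/w/api.php"

def geosearch_url(lat, lon, radius_m=500, limit=10):
    """Build a list=geosearch request URL for File pages (namespace 6)."""
    params = {
        "format": "json",
        "action": "query",
        "list": "geosearch",
        "gsprimary": "all",
        "gsnamespace": 6,        # File: namespace
        "gsradius": radius_m,    # must be between 10 and 10000 meters
        "gslimit": limit,
        "gscoord": f"{lat}|{lon}",
    }
    return COMMONS_API + "?" + urllib.parse.urlencode(params)

def nearby_files(lat, lon, radius_m=500):
    """Return the list of geosearch hits around (lat, lon)."""
    with urllib.request.urlopen(geosearch_url(lat, lon, radius_m)) as resp:
        return json.load(resp)["query"]["geosearch"]
```

Each hit carries title, lat, lon, and dist, so the results can be placed on a map directly; for thumbnails, switch to the generator form of the request as shown above.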


Datadog monitoring - create a metric that always returns a fixed value

In our application we use Datadog for monitoring: we track memory usage and send alert notifications to a Slack channel. The monitoring config file is below. I have added the formula for the numerator_metric, but the denominator_metric should be the fixed value 1700. I can't put the value in directly; instead I need to create a metric that always returns 1700.
{
    "type": "metric_ratio",
    "comparison": ">",
    "numerator_metric": "aws.lambda.enhanced.max_memory_used{functionname:sample-function}",
    "denominator_metric": "",
    "warning_threshold": 70,
    "target_threshold": 75,
    "description": "Increased memory usage [prod]",
    "monitor_timeframe": "1h",
    "slo_timeframe": "7d",
    "name": "Memory Usage",
    "notify": [
        { "platform": "slack", "reference": "slack-channel" }
    ]
}
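One way to get a constant-valued metric is to submit it yourself on a schedule (for example from a cron job or a small Lambda): POST a gauge with a fixed value to Datadog's v1 series endpoint. A hedged sketch follows; the metric name custom.constant.memory_budget and the helper names are mine, and only the payload shape follows Datadog's /api/v1/series API:

```python
import json
import time
import urllib.request

def constant_metric_payload(name, value, tags=None):
    """Build a Datadog /api/v1/series body reporting `value` as a gauge now."""
    return {
        "series": [{
            "metric": name,
            "type": "gauge",
            "points": [[int(time.time()), value]],  # (epoch seconds, value)
            "tags": tags or [],
        }]
    }

def submit(api_key, payload):
    """POST the payload; run this periodically so the metric never goes stale."""
    req = urllib.request.Request(
        "https://api.datadoghq.com/api/v1/series",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json", "DD-API-KEY": api_key},
    )
    return urllib.request.urlopen(req)
```

With that in place, the hypothetical metric name (e.g. custom.constant.memory_budget{*}) could be used as the denominator_metric in the config above.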

Bug? Using a data transform with date/timeunit

For a while now I've been using a flatten transform to move data from a columnar data frame to a Vega Lite spec, and wonder if it's the source of a recent problem I've encountered with a time series. The time series spec is in a gist on the vega editor, along with a similar pattern plotting sin and cos that uses flatten without a problem.
As near as I can tell, the spx time series should work with this transform, and it does work if I flatten the data myself. I'm out of ideas; maybe there's something odd with flatten and date/times? I've tried every combination of type specification, timeUnit, etc., I can think of but nothing seems to work.
Can anyone spot an error in the spx time series gist?
For some reason your dates are not being parsed properly. It may be a bug, but you can easily work around it by using a calculate transform to convert your date strings to proper dates.
Editor
{
"transform": [
{"flatten": ["spxdate", "open", "high", "low", "close"]},
{"calculate": "toDate(datum.spxdate)", "as": "new"}
],
"data": {
"values": {
"spxdate": [
"1990-01-02",
"1990-01-03",
"1990-01-04",
"1990-01-05",
"1990-01-08",
"1990-01-09",
"1990-01-10",
"1990-01-11",
"1990-01-12",
"1990-01-15",
"1990-01-16",
"1990-01-17",
"1990-01-18",
"1990-01-19",
"1990-01-22",
"1990-01-23",
"1990-01-24",
"1990-01-25",
"1990-01-26",
"1990-01-29",
"1990-01-30",
"1990-01-31",
"1990-02-01",
"1990-02-02",
"1990-02-05",
"1990-02-06",
"1990-02-07",
"1990-02-08",
"1990-02-09",
"1990-02-12",
"1990-02-13",
"1990-02-14",
"1990-02-15",
"1990-02-16",
"1990-02-20",
"1990-02-21",
"1990-02-22",
"1990-02-23",
"1990-02-26",
"1990-02-27",
"1990-02-28",
"1990-03-01",
"1990-03-02",
"1990-03-05",
"1990-03-06",
"1990-03-07",
"1990-03-08",
"1990-03-09",
"1990-03-12",
"1990-03-13",
"1990-03-14",
"1990-03-15",
"1990-03-16",
"1990-03-19",
"1990-03-20",
"1990-03-21",
"1990-03-22",
"1990-03-23",
"1990-03-26",
"1990-03-27",
"1990-03-28",
"1990-03-29",
"1990-03-30",
"1990-04-02",
"1990-04-03",
"1990-04-04",
"1990-04-05",
"1990-04-06",
"1990-04-09",
"1990-04-10",
"1990-04-11",
"1990-04-12",
"1990-04-16",
"1990-04-17",
"1990-04-18",
"1990-04-19",
"1990-04-20",
"1990-04-23",
"1990-04-24",
"1990-04-25",
"1990-04-26",
"1990-04-27",
"1990-04-30",
"1990-05-01",
"1990-05-02",
"1990-05-03",
"1990-05-04",
"1990-05-07",
"1990-05-08",
"1990-05-09",
"1990-05-10",
"1990-05-11",
"1990-05-14",
"1990-05-15",
"1990-05-16",
"1990-05-17",
"1990-05-18",
"1990-05-21",
"1990-05-22",
"1990-05-23"
],
"open": [
353.3999938964844,
359.69000244140625,
358.760009765625,
355.6700134277344,
352.20001220703125,
353.8299865722656,
349.6199951171875,
347.30999755859375,
348.5299987792969,
339.92999267578125,
337,
340.7699890136719,
337.3999938964844,
338.19000244140625,
339.1400146484375,
330.3800048828125,
331.6099853515625,
330.260009765625,
326.0899963378906,
325.79998779296875,
325.20001220703125,
322.9800109863281,
329.0799865722656,
328.7900085449219,
330.9200134277344,
331.8500061035156,
329.6600036621094,
333.75,
333.0199890136719,
333.6199951171875,
330.0799865722656,
331.0199890136719,
332.010009765625,
334.8900146484375,
332.7200012207031,
327.9100036621094,
327.6700134277344,
325.70001220703125,
324.1600036621094,
328.67999267578125,
330.260009765625,
331.8900146484375,
332.739990234375,
335.5400085449219,
333.739990234375,
337.92999267578125,
336.95001220703125,
340.1199951171875,
337.92999267578125,
338.6700134277344,
336,
336.8699951171875,
338.07000732421875,
341.9100036621094,
343.5299987792969,
341.57000732421875,
339.739990234375,
335.69000244140625,
337.2200012207031,
337.6300048828125,
341.5,
342,
340.7900085449219,
339.94000244140625,
338.70001220703125,
343.6400146484375,
341.0899963378906,
340.7300109863281,
340.0799865722656,
341.3699951171875,
342.07000732421875,
341.9200134277344,
344.3399963378906,
344.739990234375,
344.67999267578125,
340.7200012207031,
338.0899963378906,
335.1199951171875,
331.04998779296875,
330.3599853515625,
332.0299987792969,
332.9200134277344,
329.1099853515625,
330.79998779296875,
332.25,
334.4800109863281,
335.5799865722656,
338.3900146484375,
340.5299987792969,
342.010009765625,
342.8699951171875,
343.82000732421875,
352,
354.75,
354.2699890136719,
354,
354.4700012207031,
354.6400146484375,
358,
358.42999267578125
],
"high": [
359.69000244140625,
360.5899963378906,
358.760009765625,
355.6700134277344,
354.239990234375,
354.1700134277344,
349.6199951171875,
350.1400146484375,
348.5299987792969,
339.94000244140625,
340.75,
342.010009765625,
338.3800048828125,
340.4800109863281,
339.9599914550781,
332.760009765625,
331.7099914550781,
332.3299865722656,
328.5799865722656,
327.30999755859375,
325.7300109863281,
329.0799865722656,
329.8599853515625,
332.1000061035156,
332.1600036621094,
331.8599853515625,
333.760009765625,
336.0899963378906,
334.6000061035156,
333.6199951171875,
331.6099853515625,
333.20001220703125,
335.2099914550781,
335.6400146484375,
332.7200012207031,
328.1700134277344,
330.9800109863281,
326.1499938964844,
328.6700134277344,
331.94000244140625,
333.4800109863281,
334.3999938964844,
335.5400085449219,
336.3800048828125,
337.92999267578125,
338.8399963378906,
340.6600036621094,
340.2699890136719,
339.0799865722656,
338.6700134277344,
337.6300048828125,
338.9100036621094,
341.9100036621094,
343.760009765625,
344.489990234375,
342.3399963378906,
339.7699890136719,
337.5799865722656,
339.739990234375,
341.5,
342.5799865722656,
342.07000732421875,
341.4100036621094,
339.94000244140625,
343.760009765625,
344.1199951171875,
342.8500061035156,
341.7300109863281,
341.8299865722656,
342.4100036621094,
343,
344.7900085449219,
347.29998779296875,
345.19000244140625,
345.3299865722656,
340.7200012207031,
338.5199890136719,
335.1199951171875,
332.9700012207031,
332.739990234375,
333.760009765625,
333.57000732421875,
331.30999755859375,
332.8299865722656,
334.4800109863281,
337.0199890136719,
338.4599914550781,
341.07000732421875,
342.0299987792969,
343.0799865722656,
344.9800109863281,
352.30999755859375,
358.4100036621094,
355.0899963378906,
354.67999267578125,
356.9200134277344,
354.6400146484375,
359.07000732421875,
360.5,
359.2900085449219
],
"low": [
351.9800109863281,
357.8900146484375,
352.8900146484375,
351.3500061035156,
350.5400085449219,
349.6099853515625,
344.32000732421875,
347.30999755859375,
339.489990234375,
336.57000732421875,
333.3699951171875,
336.260009765625,
333.9800109863281,
338.19000244140625,
330.2799987792969,
328.6700134277344,
324.1700134277344,
325.3299865722656,
321.44000244140625,
321.7900085449219,
319.8299865722656,
322.9800109863281,
327.760009765625,
328.0899963378906,
330.45001220703125,
328.20001220703125,
326.54998779296875,
332,
332.4100036621094,
329.9700012207031,
327.9200134277344,
330.6400146484375,
331.6099853515625,
332.4200134277344,
326.260009765625,
324.4700012207031,
325.70001220703125,
322.1000061035156,
323.9800109863281,
328.4700012207031,
330.1600036621094,
331.0799865722656,
332.7200012207031,
333.489990234375,
333.57000732421875,
336.3299865722656,
336.95001220703125,
336.8399963378906,
336.1400146484375,
335.3599853515625,
334.92999267578125,
336.8699951171875,
338.07000732421875,
339.1199951171875,
340.8699951171875,
339.55999755859375,
333.6199951171875,
335.69000244140625,
337.2200012207031,
337.0299987792969,
340.6000061035156,
339.7699890136719,
338.2099914550781,
336.3299865722656,
338.70001220703125,
340.3999938964844,
340.6300048828125,
338.94000244140625,
339.8800048828125,
340.6199951171875,
341.260009765625,
341.9100036621094,
344.1000061035156,
342.05999755859375,
340.1099853515625,
337.5899963378906,
333.4100036621094,
330.0899963378906,
329.7099914550781,
330.3599853515625,
330.6700134277344,
328.7099914550781,
327.760009765625,
330.79998779296875,
332.1499938964844,
334.4700012207031,
335.1700134277344,
338.1099853515625,
340.1700134277344,
340.8999938964844,
342.7699890136719,
343.82000732421875,
351.95001220703125,
352.8399963378906,
351.95001220703125,
354,
352.5199890136719,
353.7799987792969,
356.0899963378906,
356.989990234375
],
"close": [
359.69000244140625,
358.760009765625,
355.6700134277344,
352.20001220703125,
353.7900085449219,
349.6199951171875,
347.30999755859375,
348.5299987792969,
339.92999267578125,
337,
340.75,
337.3999938964844,
338.19000244140625,
339.1499938964844,
330.3800048828125,
331.6099853515625,
330.260009765625,
326.0799865722656,
325.79998779296875,
325.20001220703125,
322.9800109863281,
329.0799865722656,
328.7900085449219,
330.9200134277344,
331.8500061035156,
329.6600036621094,
333.75,
332.9599914550781,
333.6199951171875,
330.0799865722656,
331.0199890136719,
332.010009765625,
334.8900146484375,
332.7200012207031,
327.989990234375,
327.6700134277344,
325.70001220703125,
324.1499938964844,
328.6700134277344,
330.260009765625,
331.8900146484375,
332.739990234375,
335.5400085449219,
333.739990234375,
337.92999267578125,
336.95001220703125,
340.2699890136719,
337.92999267578125,
338.6700134277344,
336,
336.8699951171875,
338.07000732421875,
341.9100036621094,
343.5299987792969,
341.57000732421875,
339.739990234375,
335.69000244140625,
337.2200012207031,
337.6300048828125,
341.5,
342,
340.7900085449219,
339.94000244140625,
338.70001220703125,
343.6400146484375,
341.0899963378906,
340.7300109863281,
340.0799865722656,
341.3699951171875,
342.07000732421875,
341.9200134277344,
344.3399963378906,
344.739990234375,
344.67999267578125,
340.7200012207031,
338.0899963378906,
335.1199951171875,
331.04998779296875,
330.3599853515625,
332.0299987792969,
332.9200134277344,
329.1099853515625,
330.79998779296875,
332.25,
334.4800109863281,
335.57000732421875,
338.3900146484375,
340.5299987792969,
342.010009765625,
342.8599853515625,
343.82000732421875,
352,
354.75,
354.2799987792969,
354,
354.4700012207031,
354.6400146484375,
358,
358.42999267578125,
359.2900085449219
]
}
},
"$schema": "https://vega.github.io/schema/vega-lite/v5.json",
"height": 400,
"width": 600,
"mark": "line",
"encoding": {
"x": {"field": "new", "type": "temporal", "timeUnit": "yearmonthdate"},
"y": {"field": "close", "type": "quantitative"}
}
}
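Stripped down, the fix is just the calculate transform applied after the flatten; a minimal version of the spec above, with a two-point stand-in for the data:

```json
{
  "$schema": "https://vega.github.io/schema/vega-lite/v5.json",
  "data": {
    "values": {
      "spxdate": ["1990-01-02", "1990-01-03"],
      "close": [359.69, 358.76]
    }
  },
  "transform": [
    {"flatten": ["spxdate", "close"]},
    {"calculate": "toDate(datum.spxdate)", "as": "new"}
  ],
  "mark": "line",
  "encoding": {
    "x": {"field": "new", "type": "temporal", "timeUnit": "yearmonthdate"},
    "y": {"field": "close", "type": "quantitative"}
  }
}
```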

Is there a way to use OCR to extract specific data from a CAD technical drawing?

I'm trying to use OCR to extract only the base dimensions of a CAD model, but there are other associative dimensions that I don't need (like angles, length from baseline to hole, etc). Here is an example of a technical drawing. (The numbers in red circles are the base dimensions, the rest in purple highlights are the ones to ignore.) How can I tell my program to extract only the base dimensions (the height, length, and width of a block before it goes through the CNC)?
The issue is that the drawings I get are not in a specific format, so I can't tell the OCR where the dimensions are; it has to work that out contextually on its own.
Should I train the program through machine learning by running several iterations and correcting it? If so, what methods are there? The only thing I can think of is OpenCV cascade classifiers.
Or are there other methods for solving this problem?
Sorry for the long post. Thanks.
I feel you... it's a very tricky problem, and we spent the last three years finding a solution for it. Forgive me for mentioning our own solution, but it should solve your problem: pip install werk24
from werk24 import Hook, W24AskVariantMeasures
from werk24.models.techread import W24TechreadMessage
from werk24.utils import w24_read_sync

from . import get_drawing_bytes  # define your own


def recv_measures(message: W24TechreadMessage) -> None:
    for cur_measure in message.payload_dict.get('measures'):
        print(cur_measure)


if __name__ == "__main__":
    # define what information you want to receive from the API
    # and what shall be done when the info is available
    hooks = [Hook(ask=W24AskVariantMeasures(), function=recv_measures)]

    # submit the request to the Werk24 API
    w24_read_sync(get_drawing_bytes(), hooks)
For your example drawing it will return, for instance, the following measure:
{
    "position": <STRIPPED>,
    "label": {
        "blurb": "ø30 H7 +0.0210/0",
        "quantity": 1,
        "size": {
            "blurb": "30",
            "size_type": "DIAMETER",
            "nominal_size": "30.0"
        },
        "unit": "MILLIMETER",
        "size_tolerance": {
            "toleration_type": "FIT_SIZE_ISO",
            "blurb": "H7",
            "deviation_lower": "0.0",
            "deviation_upper": "0.0210",
            "fundamental_deviation": "H",
            "tolerance_grade": {
                "grade": 7,
                "warnings": []
            },
            "thread": null,
            "chamfer": null,
            "depth": null,
            "test_dimension": null
        }
    },
    "warnings": [],
    "confidence": 0.98810
}
or for a GD&T
{
    "position": <STRIPPED>,
    "frame": {
        "blurb": "[⟂|0.05|A]",
        "characteristic": "⟂",
        "zone_shape": null,
        "zone_value": {
            "blurb": "0.05",
            "width_min": 0.05,
            "width_max": null,
            "extend_quantity": null,
            "extend_shape": null,
            "extend": null,
            "extend_angle": null
        },
        "zone_combinations": [],
        "zone_offset": null,
        "zone_constraint": null,
        "feature_filter": null,
        "feature_associated": null,
        "feature_derived": null,
        "reference_association": null,
        "reference_parameter": null,
        "material_condition": null,
        "state": null,
        "data": [
            {
                "blurb": "A"
            }
        ]
    }
}
Check the documentation on Werk24 for details.
Although it is a managed offering, Mixpeek is one free option:
pip install mixpeek
from mixpeek import Mixpeek

mix = Mixpeek(api_key="my-api-key")

mix.upload(file_name="design_spec.dwg", file_path="s3://design_spec_1.dwg")
This /upload endpoint will extract the contents of your DWG file; then, when you search for terms, the response will include the file_path so you can render it in your HTML.
Behind the scenes it uses the open source LibreDWG library to run a number of AutoCAD native commands such as DATAEXTRACTION.
Now you can search for a term and the relevant DWG file (in addition to the context in which it exists) will be returned:
mix.search(query="retainer", include_context=True)
[
    {
        "file_id": "6377c98b3c4f239f17663d79",
        "filename": "design_spec.dwg",
        "context": [
            {
                "texts": [
                    {
                        "type": "text",
                        "value": "DV-34-"
                    },
                    {
                        "type": "hit",
                        "value": "RETAINER"
                    },
                    {
                        "type": "text",
                        "value": "."
                    }
                ]
            }
        ],
        "importance": "100%",
        "static_file_url": "s3://design_spec_1.dwg"
    }
]
More documentation here: https://docs.mixpeek.com/

How to get the value from a List using Rest Assured?

I am trying to grab the first value of courseNumber for studentId=123 using Rest Assured.
When I use
.body("students.courseList.courseNumber[0]", equalTo(1000000000))
I am getting:
Expected: <1000000000>
Actual: [1000000000, 1000000001, 1000000002, 1000000003, ...........]
There are more than 10 courseList entries for studentId 123.
Also, is it possible to use a regex to grab a particular element from the JSON response? And how do I find the path of an element when I have a few thousand lines of JSON response?
Sample Response -
{
    "students": [
        {
            "studentId": "ABC",
            "studentName": "Abcd Abcd",
            "courseDescription": "AAJSHKJASSJAK LASNLKASA KJk;;K K;;KK;K;KL;K;",
            "creditRemaining": 100,
            "classStartDate": "20191220"
        },
        {
            "studentId": "123",
            "studentName": "DEFG, VBNH",
            "courseDescription": "AAJSHKJASSJAK LASNLKASA KJk;;K K;;KK;K;KL;K;",
            "classSchedule": 2,
            "classStartDate": "20191220",
            "slotsRemaining": 10,
            "courseList": [
                {
                    "courseNumber": 1000000000,
                    "courseName": "Chemistry",
                    "courseInstructor": "HGJ IOUIOU",
                    "courseCity": "New York",
                    "courseLevel": 100,
                    "description": "GJKJLKJLafgdhgf ljkllklk klyiyy mnbbnkljlkj yttiuyuyuyoyo jhlkjkljkl"
                },
                {
                    "courseNumber": 1000000001,
                    "courseName": "History",
                    "courseInstructor": "HGJ IOUIOU",
                    "courseCity": "New York",
                    "courseLevel": 100,
                    "description": "GJKJLKJLafgdhgf ljkllklk klyiyy mnbbnkljlkj yttiuyuyuyoyo jhlkjkljkl"
                }
            ]
        }
    ]
}
Because courseNumber is collected across a list of students, each with its own list of courses, the GPath result is a list of lists, so you need two indices:
students.courseList.courseNumber[0][0] --> will give 1000000000
students.courseList.courseNumber[0][1] --> will give 1000000001

I want to fetch product_description, but it contains HTML, so I am confused about how to get a proper format from it

JSON:
{
    "wishlist": [ { "wishlist": 0 } ],
    "cart": [ { "cart": 1 } ],
    "product": [
        {
            "promo_id": 0,
            "avals": 0,
            "dis": null,
            "mp_product_id": 252,
            "mp_category_id": 113,
            "product_name": "Pink Soft Net Fabric Kids Angel Lehenga Choli",
            "product_description": "This Pink Coloured Traditional Soft Net Fabric Lehenga Choli gives a beautifull look to your child. This Outfit come with Brocket Fabric Lehenga and Top has Soft Net Fabric with Silk Lining come along with Soft Net Dupatta .\r\n\r\nYou can make your kids wear this outfit for parties and functions.\r\n\r\nType :\r\n\r\n*Semi-Stitched*\r\n\r\nFABRIC :\r\n\r\nTop : Unstitched Designer Brocade fabric\r\nBottom : stitched Soft Net fabric\r\nDupatta : Soft Net fabric\r\nInner : Silk fabric\r\n\r\nSize Chart :\r\n\r\n1 to 5 year : 30 inches\r\n6 to 8 year : 32 inches\r\n9 to 10 year : 34 inches\r\n10 to 15 year: 36 inches\r\n\r\nCare\r\nDry Clean\r\n",
            "sku_number": "Angel_3_Pink",
            "qty": 25,
            "likes_count": 0,
            "list_price": 2082,
            "selling_price": 1249,
            "discount": 41
        }
    ],
    "image": [ { "image_name": "Sweet Angel Vol3-Pink.jpg" } ],
    "variant": [ { "Color": "PINK", "Size": "S,M,L,XL", "Occasion": "Party" } ],
    "related": [
        {
            "mp_product_id": 231,
            "mp_category_id": 113,
            "product_name": "White Peacock Kids Indo Western ",
            "product_description": "This White Coloured Traditional Banglory Top Fabrics Indo Western gives a beautifull look to your child. This Outfit come with Paper Silk Fabric Lehenga and Top has Banglory Fabric.\r\n\r\nYou can make your kids wear this outfit for parties and functions.\r\n\r\nType :\r\n\r\n*Stitched*\r\n\r\nFABRIC :\r\n\r\nTop - Banglory (foam seat work),\r\n\r\nLehenga - Paper silk,\r\n\r\nSize Chart :\r\n\r\n6 to 12 year : 34 inches\r\n\r\nCare\r\nDry Clean\r\n",
            "product_image": "",
            "seller_product_code": "White_Peacock",
            "system_product_code": 0,
            "sku_number": "White_Peacock",
            "status": "A",
            "is_features": 0,
            "list_price": 2271,
            "selling_price": 1249,
            "qty": 25,
            "weight": "700",
            "cod_charge": "0",
            "shipping_charge": "0",
            "likes_count": 0,
            "create_date": "2017-06-26 12:38:51",
            "modify_date": "2017-06-26 12:39:37",
            "main_order": 14,
            "set_order": 0,
            "image_name": "Peacock White Kids \u00a0--- CKL 216 --- Rs. 625.jpg"
        }
    ]
}
Yes, you are doing the same thing, but your code is prone to crashes because you are force-unwrapping optionals. Try this instead:
Alamofire.request(APIProductDetail, method: .get, parameters: params, encoding: URLEncoding.default, headers: nil).responseJSON { (response) in
    switch response.result {
    case .success(let responseResultValue):
        if let responseResult = responseResultValue as? [String: Any] {
            if let productsList = responseResult["product"] as? [[String: Any]] {
                for product in productsList {
                    if let productDescription = product["product_description"] as? String {
                        print(productDescription) // Here you will get the product description
                    }
                }
            }
        }
    case .failure(let error):
        // handle any error here
        print(error.localizedDescription)
    }
}
